Carbon Emissions from Coal-Fired Power Plants

13 September, 2013

The 50 dirtiest electric power plants in the United States—all coal-fired—emit as much carbon dioxide as half of America’s 240 million cars.

The dirtiest 1% spew out a third of the carbon produced by US power plants.

And the 100 dirtiest plants—still a tiny fraction of the country’s 6,000 power plants—account for a fifth of all US carbon emissions.

According to this report, curbing the emissions of these worst offenders would be one of the best ways to cut US carbon emissions, reducing the risk that emissions will trigger dangerous climate change:

• Environment America Research and Policy Center, America’s dirtiest power plants: their oversized contribution to global warming and what we can do about it, 2013.

Some states in the US already limit carbon pollution from power plants. At the start of this year, California imposed a cap on carbon dioxide emissions, and in 2014 it will link with Quebec’s carbon market. Nine states from Maine to Maryland participate in the Regional Greenhouse Gas Initiative (RGGI), which caps emissions from power plants in the Northeast.

At the federal level, a big step forward was the 2007 Supreme Court decision holding that the Environmental Protection Agency has the authority to regulate carbon dioxide under the Clean Air Act. The EPA is now getting ready to impose carbon emission limits for all new power plants in the US. But some of the largest sources of carbon dioxide are existing power plants, so getting them to shape up or shut down could have big benefits.

What to do?

Here’s what the report suggests:

• The Obama Administration should set strong limits on carbon dioxide pollution from new power plants to prevent the construction of a new generation of dirty power plants, and force existing power plants to clean up by setting strong limits on carbon dioxide emissions from all existing power plants.

• New plants – The Environmental Protection Agency (EPA) should work to meet its September 2013 deadline for re-proposing a stringent emissions standard for new power plants. It should also set a deadline for finalizing these standards no later than June 2015.

• Existing plants – The EPA should work to meet the timeline put forth by President Obama for proposing and finalizing emissions standards for existing power plants. This timeline calls for limits on existing plants to be proposed by June 2014 and finalized by June 2015. The standards should be based on the most recent climate science and designed to achieve the emissions reduction targets that are necessary to avoid the worst impacts of global warming.

In addition to cutting pollution from power plants, the United States should adopt a suite of clean energy policies at the local, state, and federal levels to curb emissions of carbon dioxide from energy use in other sectors.

In particular, the United States should prioritize establishing a comprehensive, national plan to reduce carbon pollution from all sources – including transportation, industrial activities, and the commercial and residential sectors.

Other policies to curb emissions include:

• Retrofitting three-quarters of America’s homes and businesses for improved energy efficiency, and implementing strong building energy codes to dramatically reduce fossil fuel consumption in new homes and businesses.

• Adopting a federal renewable electricity standard that calls for 25 percent of America’s electricity to come from clean, renewable sources by 2025.

• Strengthening and implementing state energy efficiency resource standards that require utilities to deliver energy efficiency improvements in homes, businesses and industries.

• Installing more than 200 gigawatts of solar panels and other forms of distributed renewable energy at residential, commercial and industrial buildings over the next two decades.

• Encouraging the use of energy-saving combined heat-and-power systems in industry.

• Facilitating the deployment of millions of plug-in vehicles that operate partly or solely on electricity, and adopting clean fuel standards that require a reduction in the carbon intensity of transportation fuels.

• Ensuring that the majority of new residential and commercial development in metropolitan areas takes place in compact, walkable communities with access to a range of transportation options.

• Expanding public transportation service to double ridership by 2030, encouraging further ridership increases through better transit service, and reducing per-mile global warming pollution from transit vehicles. The U.S. should also build high-speed rail lines in 11 high-priority corridors by 2030.

• Strengthening and expanding the Regional Greenhouse Gas Initiative, which limits carbon dioxide pollution from power plants in nine northeastern states, and implementing California’s Global Warming Solutions Act (AB32), which places an economy-wide cap on the state’s greenhouse gas emissions.

Carbon emitted per unit of energy produced

An appendix to this report lists the power plants that emit the most carbon dioxide by name, along with estimates of their emissions. That’s great! But annoyingly, it does not seem to list the amount of energy each plant produces per year.

If carbon emissions were strictly proportional to the amount of energy produced, that would tend to undercut the notion that the biggest carbon emitters are especially naughty. But in fact there’s a lot of variability in the amount of carbon emitted per unit of energy generated. You can see that in this chart of theirs:

So, it would be good to see a list of the worst power plants in terms of CO2 emitted per energy generated.

The people who prepared this report could probably create such a list without much extra work, since they write:

We obtained fuel consumption and electricity generation data for power plants operating in the United States from the U.S. Department of Energy’s Energy Information Administration (EIA) 2011 December EIA-923 Monthly Time Series.
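
If you have both the emissions and the generation figures, the ranking is a one-line computation. Here’s a minimal sketch in Python, assuming you’ve already merged the EIA data into a table with made-up column names plant_name, co2_tons and net_generation_mwh; the real EIA-923 spreadsheets are laid out differently, so treat this as the shape of the calculation rather than a working recipe:

```python
import pandas as pd

# Hypothetical input: one row per plant, with annual CO2 emissions (tons)
# and annual net generation (MWh). Column names are invented for this sketch;
# the actual EIA-923 files need some reshaping first.
plants = pd.read_csv("plant_emissions_and_generation.csv")

# Emission intensity: tons of CO2 per MWh generated.
plants["tons_co2_per_mwh"] = plants["co2_tons"] / plants["net_generation_mwh"]

# The list the report doesn't give: the worst plants per unit of energy.
worst_by_intensity = plants.sort_values("tons_co2_per_mwh", ascending=False)
print(worst_by_intensity[["plant_name", "tons_co2_per_mwh"]].head(100))
```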


Why Most Published Research Findings Are False

11 September, 2013

My title here is the eye-catching—but exaggerated!—title of this well-known paper:

• John P. A. Ioannidis, Why most published research findings are false, PLoS Medicine 2 (2005), e124.

It’s open-access, so go ahead and read it! Here is his bold claim:

Published research findings are sometimes refuted by subsequent evidence, with ensuing confusion and disappointment. Refutation and controversy is seen across the range of research designs, from clinical trials and traditional epidemiological studies to the most modern molecular research. There is increasing concern that in modern research, false findings may be the majority or even the vast majority of published research claims. However, this should not be surprising. It can be proven that most claimed research findings are false. Here I will examine the key factors that influence this problem and some corollaries thereof.

He’s not really talking about all ‘research findings’, just research that uses the

ill-founded strategy of claiming conclusive research findings solely on the basis of a single study assessed by formal statistical significance, typically for a p-value less than 0.05.

His main interests are medicine and biology, but many of the problems he discusses are more general.

His paper is a bit technical—but luckily, one of the main points was nicely explained in the comic strip xkcd:


If you try 20 or more things, you should not be surprised when an event with probability 0.05 = 1/20 happens at least once! It’s nothing to write home about… and nothing to write a scientific paper about.
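
To make the arithmetic concrete: if each of 20 independent tests of a true null hypothesis has a 5% chance of giving a false positive, the chance of at least one false positive is 1 − 0.95^20, about 64%. Here’s a quick simulation (my own illustration, not from the paper) that checks this:

```python
import random

# Probability of at least one "significant" result when testing 20 true null
# hypotheses at the p < 0.05 level, estimated by simulation.
alpha, n_tests, n_trials = 0.05, 20, 100_000

hits = 0
for _ in range(n_trials):
    # Under the null hypothesis a p-value is uniform on [0, 1],
    # so each test comes out "significant" with probability alpha.
    if any(random.random() < alpha for _ in range(n_tests)):
        hits += 1

print("simulated:", hits / n_trials)             # roughly 0.64
print("exact:    ", 1 - (1 - alpha) ** n_tests)  # 1 - 0.95^20 ≈ 0.64
```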

Even researchers who don’t make this mistake deliberately can do it accidentally. Ioannidis draws several conclusions, which he calls corollaries:

Corollary 1: The smaller the studies, the less likely the research findings are to be true. (If you test just a few jelly beans to see which ones ‘cause acne’, you can easily fool yourself.)

Corollary 2: The smaller the effects being measured, the less likely the research findings are to be true. (If you’re studying whether jelly beans cause just a tiny bit of acne, you can easily fool yourself.)

Corollary 3: The more quantities there are to find relationships between, the less likely the research findings are to be true. (If you’re studying whether hundreds of colors of jelly beans cause hundreds of different diseases, you can easily fool yourself.)

Corollary 4: The greater the flexibility in designing studies, the less likely the research findings are to be true. (If you use lots and lots of different tricks to see if different colors of jelly beans ‘cause acne’, you can easily fool yourself.)

Corollary 5: The more financial and other interests and prejudices in a scientific field, the less likely the research findings are to be true. (If there’s huge money to be made selling acne-preventing jelly beans to teenagers, you can easily fool yourself.)

Corollary 6: The hotter a scientific field, and the more scientific teams involved, the less likely the research findings are to be true. (If lots of scientists are eagerly doing experiments to find colors of jelly beans that prevent acne, it’s easy for someone to fool themselves… and everyone else.)

Ioannidis states his corollaries in more detail; I’ve simplified them to make them easy to understand, but if you care about this stuff, you should read what he actually says!
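
For readers who want the quantitative version: the key quantity in Ioannidis’s argument is the positive predictive value (PPV), the probability that a claimed finding is actually true, expressed in terms of the pre-study odds R that a probed relationship is real, the significance threshold α and the false negative rate β. Here’s a small sketch of that formula, leaving out his bias term for simplicity:

```python
def ppv(R, alpha=0.05, beta=0.2):
    """Positive predictive value of a claimed finding, following the
    2x2-table argument in Ioannidis (2005), without the bias term.

    R     -- pre-study odds that a probed relationship is true
    alpha -- significance threshold (type I error rate)
    beta  -- type II error rate (1 - power)
    """
    return (1 - beta) * R / (R - beta * R + alpha)

# A well-powered test of a plausible hypothesis:
print(ppv(R=1.0))    # ≈ 0.94
# A long-shot exploratory search, say 1 true relationship per 1000 probed:
print(ppv(R=0.001))  # ≈ 0.016 -- most "findings" are false
```

Low pre-study odds are exactly the situation of Corollaries 3, 4 and 6, where many relationships are probed and few of them are real.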

The Open Science Framework

Since his paper came out—and many others on this general theme—people have gotten more serious about improving the quality of statistical studies. One effort is the Open Science Framework.

Here’s what their website says:

The Open Science Framework (OSF) is part network of research materials, part version control system, and part collaboration software. The purpose of the software is to support the scientist’s workflow and help increase the alignment between scientific values and scientific practices.

Document and archive studies.

Move the organization and management of study materials from the desktop into the cloud. Labs can organize, share, and archive study materials among team members. Web-based project management reduces the likelihood of losing study materials due to computer malfunction, changing personnel, or just forgetting where you put the damn thing.

Share and find materials.

With a click, make study materials public so that other researchers can find, use and cite them. Find materials by other researchers to avoid reinventing something that already exists.

Detail individual contribution.

Assign citable, contributor credit to any research material – tools, analysis scripts, methods, measures, data.

Increase transparency.

Make as much of the scientific workflow public as desired – as it is developed or after publication of reports. Find public projects here.

Registration.

Registering materials can certify what was done in advance of data analysis, or confirm the exact state of the project at important points of the lifecycle such as manuscript submission or at the onset of data collection. Discover public registrations here.

Manage scientific workflow.

A structured, flexible system can provide efficiency gain to workflow and clarity to project objectives, as pictured.

CONSORT

Another group trying to improve the quality of scientific research is CONSORT, which stands for Consolidated Standards of Reporting Trials. This is mainly aimed at medicine, but it’s more broadly applicable.

The key here is the “CONSORT Statement”, a 25-point checklist saying what you should have in any paper about a randomized controlled trial, and a flow chart saying a bit about how the experiment should work.

What else?

What are the biggest other efforts that are being made to improve the quality of scientific research?


Azimuth Blog Overview

6 September, 2013

We’ve got lots of series of articles on this blog. Some people say it’s a bit overwhelming. So David Tanzer of the Azimuth Project had a good idea: create an organized list of the articles on this blog, to make them easier to find. Here it is:

Azimuth Blog overview.

You can also find a link to this on top of the “ALSO READ THESE” list at the right-hand side of this blog!

Needless to say, this could be improved in many ways. Don’t say how: just do it!


What To Do? (Part 2)

28 August, 2013

Dear John,

If you could do anything to change the world what would you do? Many people haven’t had the opportunity to ponder that question because they have been busy studying what could be possible within a particular set of resource constraints. However, what if we push the limits? If all the barriers were removed, then what would you do?

The XXXXXXXXX Foundation has an open, aggressive, and entrepreneurial approach to philanthropy. Our goal is to produce substantial, widespread and lasting changes to society that will maximize opportunity and minimize injustice. We tap into the minds of fearless thinkers who have big, bold, transformational ideas, and work with them to invest in strategies designed to solve persistent problems.

Our team is reaching out to you because we believe you are the type of innovative thinker with ideas that just might change the world. While this is not a promise of grant funding, it is an invitation to share your ideas. You can learn more about the XXXXXXXXX Foundation by visiting our website. Thank you for your interest and I look forward to hearing your ideas.

Sincerely,
ZZZZZZZ


I got this email yesterday. While I have some ideas, I really want to make the most of this chance. So: what would you do if you got this sort of opportunity? To keep things simple, let’s assume this is a legitimate email from a truly well-meaning organization—I’m checking that, and it seems to be so. Assume they could spend 1-10 million dollars on a really good idea, and assume you really want to help the world. What idea would you suggest?

Some ideas

Here are some comments from G+ to get your thoughts going. Heather Vandagriff wrote:

Hard core grassroots organization toward political involvement and education on climate issues. 

Jason Holt wrote:

Ideas are cheap.
http://www.pretotyping.org/the-pretotyping-manifesto-1/innovators-beat-ideas

Borislav Iordanov wrote:

I don’t agree that ideas are cheap. It could take a lifetime to have a really good one. However, one could argue that really good ideas are probably already funded. But if to maximize opportunity and to minimize injustice is the motivation, I say government transparency should be top priority. I can google the answer to almost any technical or scientific question, any historical fact, or pop culture, you name it. But I can’t know what my government is doing. And I’m not talking only, or even mostly, about things that governments hide. I’m talking about mundane day-to-day operations that are potentially not conducted in the best interest of the people, knowingly or unknowingly. I can easily find what are the upcoming concerts or movies, but it’s much harder to find out what, for instance, my local government is currently discussing so I can perhaps stop by the commissioner chamber and have my voice being heard (why aren’t there TV commercials about the public hearing of the next city ordinance?).

I realize this is not a concrete idea, but there are plenty of projects in that direction around the internet. And I don’t think such projects should come only from within government agencies because there is a conflict of interest.

Bottom line is that any sustainable, permanent change towards a better society has to involve the political process in some way, and the best (peaceful!) way to enact change there starts with real and consequential openness. Didn’t expect to write so much, sorry…


John Baez wrote:

Borislav Iordanov wrote:

But if to maximize opportunity and to minimize injustice is the motivation, I say government transparency should be top priority.

That’s a great idea… and in fact, this foundation already has a project to promote government transparency. So, I’ll either need to come up with a specific way to promote it that they haven’t thought about, or come up with something else.

Noon Silk wrote:

I guess the easy answer is some sort of education program; educating people in some way so as to generate the skills necessary to do the thing that you really want to do. So I don’t know. Perhaps part of it could be some sort of campaign to get a few coursera et al courses on climate maths, etc, and building some sort of innovative and exciting program around that.

Richard Lucas wrote:

Use existing corporate law (thanks, Capitalists!) to create collectives (maybe non-profits?) into which people could elect to participate. Participation would be contingent upon adoption of a certain set of standards for behavior impossible in the broader, geographical society in which we are immersed. Participants would enjoy a guaranteed minimum income, health care, etc – the goals of Communism, but in a limited scope, applied to participants who also exist in the general society. It’s just that participants would agree to share time, resources, and expertise with the collective. If collective living can’t be made to work in such an environment, where participation could be relatively selective up front, to include the honest and the committed…. well, then it can’t work. When the right formula is established, and the standard of living for participants is greater than for peers who are not “participants”, then you can expect more people to join. A tipping point would eventually be reached, where the majority of citizens in the broader, geographical society were also participants in an optional, voluntary, supersociety which does not respect geographic or national boundaries.

This is the only way it will work, and the beauty is that Communists and Objectivists equally hate this idea, because it breaks their frames, and because it is legal, and because if the larger society tried to block it, they would then have to justify the ability of crazy UFO cults and religions to do it. So, it can’t be stopped. There’s no theory to defend. You just do it.

Xah Lee wrote:

put the $10M to increase internet penetration, or in other ways enhance communication such as cell phone.

absolutely do not do anything that’s normally considered as good or helpful to humanity. such as help africa, women, the weak, the cripple, poor, vaccine, donation, institutionalized education etc.

even though, i’m still doubtful there’d be any improvement of humanity. $10M is like a peanut for this. One missile is $10M… 

John Baez wrote:

Xah Lee wrote:

even though, i’m still doubtful there’d be any improvement of humanity. $10M is like a peanut for this.

There are certain activities where the benefit is roughly proportional to the amount of money spent – like, giving people bed-netting that repels malaria-carrying mosquitos, or buying textbooks for students. For such activities, $10 million is often not enough to get the job done.

But there are other activities where $10 million is the difference between some good self-perpetuating phenomenon starting to happen, and it not starting to happen. This is the kind of thing I should suggest.

It’s the difference between pushing a large rock up a long hill, and starting a rock rolling down a hill.

By the way, this foundation plans to spend a lot more than $10M in total. I just want to suggest a project that will seem cheap to them, to increase the chance that they actually pursue it.

Piotr Migdal wrote:

I think that the thing that needs a drastic change is the education system. I suggest founding a “hacker university” (or “un-university”).

The educational system was designed for preparation of soldiers and factory workers. Now the job market is very different, and one cannot hope to work in one factory for his/her lifetime. Additionally, the optimal skill set is not necessarily the same for everyone (and it changes, as the World changes). But the worst thing is that schools teach “take no initiative, just obey”, which stops working once one needs to find a job. Plus, for more creative tasks usually the top-down approach is the worst one (contrasting with the coordination tasks).

While changing the whole system may be hard, let’s think about universities; or a… un-university. Instead of attending predefined classes, let’s do the following:
• based on self-learning,
• lectures are because someone is willing to give them,
• everything voluntary (e.g. lectures and classes),
• own projects highly encouraged, starting from day one.

So basically, a collection of people who actually want to learn (!= earn a degree / prestige / position / fame), perform research which they consider the most fascinating (not merely doing science which is currently most fashionable and well-funded or “my advisor/grant/dean told so”) and undertake projects for greater good (startup-like freedom (unexperienceable in the current academia, at least – for the young) for things not necessarily giving monetary profit).

Sure, you may argue that there are more important goals (unemployment, bureaucracy, poverty, wars, ongoing destruction of natural environment – to name only a few in no particular order). But this one can be a nucleus for solving many other problems – wider in education and in general. And such a spark may yield an unimaginable firestorm (a bad metaphor, it has to be about creation); such a seed can grow, flourish and make deserts blossom.

EDIT:

By founding I don’t mean paying for administration. Quite the opposite – just rent a building, nothing more (so no tuition and no renting cost for students, to make it accessible regardless of the background). Almost all stuff (e.g. admission) in the first years would be based entirely on voluntary work.

John Baez wrote:

Noon Silk wrote: “I guess the easy answer is some sort of education program…”

That sounds good. The foundation already has a program to improve K-12 education in the United States. So, when it comes to education, I’d either need to give them ideas they haven’t tried in that realm, get them interested in education outside the US, or get them interested in post-secondary education. Piotr Migdal’s idea of a ‘hacker university’ might be one approach. It also seems the potential of free worldwide online courses has not yet been fully exploited. 

Piotr Migdal wrote:

The point is in going well beyond online courses (which, IMHO, are nice but not that revolutionary – textbooks are there for quite a few years; I consider things like Usenet, Wikipedia and StackExchange way more impactful for education) – by gathering a critical mass of intelligent and passionate people. But anyway, it may be the right time for innovations in education (and not only small patches).

Robert Byrne wrote:

Firstly, thanks for sharing this John! Secondly, congratulations on being chosen!

I would look into three aspects of this. 1) Who funds it, and whether you are comfortable with that, 2) do they choose candidates and generally have processes that make use of the experience of similar organizations such as MacArthur?, 3) what limits are there on using the grant — could you design your own prize to solve a problem using these funds?

But you’ve asked for ideas. The biggest problems that can be fixed/improved for $5 million! I’ll stick to education and technology. Here are some areas:
• Education reform in the U.S., think-tanks or writers who can create a model to switch away from municipal public education funding, with the aim of reducing disadvantage,
• Office, factory and home power efficiency technology, anything that needs $1 million to get to prototype,
• Solve the commute/car problem — e.g. how can more people work within the suburb in which they live? How can public transit be useful in sprawling suburbs?

John Baez wrote:

Robert Byrne wrote:

Firstly, thanks for sharing this John! Secondly, congratulations on being chosen!

Thanks! I’ve been chosen to give them ideas.

“I would look into three aspects of this. 1) Who funds it, and whether you are comfortable with that, 2) do they choose candidates and generally have processes that make use of the experience of similar organizations such as MacArthur?, 3) what limits are there on using the grant — could you design your own prize to solve a problem using these funds?”

Thanks – I definitely plan to look the gift horse in the mouth. They didn’t say anything about giving me a grant, except to say “this is not a promise of a grant”.

So, right now I’m treating this as an exercise in coming up with a really good idea that I’m happy to give away and let someone try. Naturally there’s a self-serving part of me that wants to pick an idea where my participation would be required. But knowing me, I’ll actually be happiest if I can catalyze something good in a limited amount of time and then think about other things.

“Solve the commute/car problem — e.g. how can more people work within the suburb in which they live? How can public transit be useful in sprawling suburbs?”

My wife Lisa raised this one. I would love to do something about this. But what can be done for just 1-10 million dollars? To do something good in this field with that amount of money, it seems we’d need to have a really smart idea: something where a small change can initiate some sort of chain reaction. Any specific ideas?

And so on…

In some ways this post is a followup to What To Do (Part 1), so if you haven’t read that, you might want to now.


The Elitzur–Vaidman Bomb-Testing Method

24 August, 2013

Quantum mechanics forces us to refine our attitude to counterfactual conditionals: questions about what would have happened if we had done something, even though we didn’t.

“What would the position of the particle be if I’d measured that… when actually I measured its momentum?” Here you’ll usually get no definite answer.

But sometimes you can use quantum mechanics to find out what would have happened if you’d done something… when classically it seems impossible!

Suppose you have a bunch of bombs. Some have a sensor that will absorb a photon you shine on it, and make the bomb explode! Others have a broken sensor, that won’t interact with the photon at all.

Can you choose some working bombs? You can tell if a bomb works by shining a photon on it. But if it works, it blows up—and then it doesn’t work anymore!

So, it sounds impossible. But with quantum mechanics you can do it. You can find some bombs that would have exploded if you had shone photons at them!

Here’s how:

Put a light that emits a single photon at A. Have the photon hit the half-silvered mirror at lower left, so it has a 50% chance of going through to the right, and a 50% chance of reflecting and going up. But in quantum mechanics, it sort of does both!

Put a bomb at B. Recombine the photon’s paths using two more mirrors. Have the two paths meet at a second half-silvered mirror at upper right. You can make it so that if the bomb doesn’t work, the photon interferes with itself and definitely goes to C, not D.

But if the bomb works, it absorbs the photon and explodes unless the photon takes the top route… in which case, when it hits the second half-silvered mirror, it has a 50% chance of going to C and a 50% chance of going to D.

So:

• If the bomb doesn’t work, the photon has a 100% chance of going to C.

• If the bomb works, there’s a 50% chance that it absorbs the photon and explodes. There’s also a 50% chance that the bomb does not explode—and then the photon is equally likely to go to either C or D. So, the photon has a 25% chance of reaching C and a 25% chance of reaching D.

So: if you see a photon at D, you know you have a working bomb… but the bomb has not exploded!

For each working bomb there’s:

• a 50% chance that it explodes,
• a 25% chance that it doesn’t explode but you can’t tell if it works,
• a 25% chance that it doesn’t explode but you can tell that it works.
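
If you want to check these probabilities, here’s a tiny amplitude calculation. It uses one standard convention for a 50/50 beam splitter (transmitted amplitude 1/√2, reflected amplitude i/√2) and labels the detectors so that a dud bomb always sends the photon to C, matching the setup described above:

```python
import numpy as np

# Mach-Zehnder interferometer in the two-path basis:
# index 0 = the lower path (the one passing the bomb), index 1 = the upper path.
BS = np.array([[1, 1j],
               [1j, 1]]) / np.sqrt(2)   # symmetric 50/50 beam splitter

start = np.array([1, 0])                # photon heading toward the first beam splitter

# --- Dud bomb: nothing measures the path, so the two amplitudes interfere ---
after = BS @ (BS @ start)
p_D, p_C = abs(after[0])**2, abs(after[1])**2   # output 1 is labeled detector C
print("dud bomb:  P(C) =", round(p_C, 3), " P(D) =", round(p_D, 3))   # 1.0 and 0.0

# --- Working bomb: it absorbs any amplitude on the lower path ---
mid = BS @ start
p_explode = abs(mid[0])**2              # photon found on the lower path: boom
survivor = np.array([0, mid[1]])        # un-normalized amplitude that survives
after = BS @ survivor
p_D, p_C = abs(after[0])**2, abs(after[1])**2
print("working bomb:  P(explode) =", round(p_explode, 3),
      " P(C) =", round(p_C, 3), " P(D) =", round(p_D, 3))   # 0.5, 0.25, 0.25
```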

This is the Elitzur–Vaidman bomb-testing method. It was invented by Avshalom Elitzur and Lev Vaidman in 1993. One year later, physicists actually did an experiment to show this idea works… but alas, not using actual bombs!

In 1996, Kwiat showed that using more clever methods, you can reduce the percentage of wasted working bombs as close to zero as you like. And pushing the idea even further, Graeme Mitchison and Richard Jozsa showed in 1999 that you can get a quantum computer to do a calculation for you without even turning it on!

This sounds amazing, but it’s really no more amazing than the bomb-testing method I’ve already described.

References

For details, read these:

• A. Elitzur and L. Vaidman, Quantum mechanical interaction-free measurements, Found. Phys. 23 (1993), 987–997.

• Paul G. Kwiat, H. Weinfurter, T. Herzog, A. Zeilinger, and M. Kasevich, Experimental realization of “interaction-free” measurements.

• Paul G. Kwiat, Interaction-free measurements.

• Graeme Mitchison and Richard Jozsa, Counterfactual computation, Proc. Roy. Soc. Lond. A457 (2001), 1175–1194.

The picture is from the Wikipedia article, which also has other references:

• Elitzur–Vaidman bomb tester, Wikipedia.

Bas Spitters pointed out this category-theoretic analysis of the issue:

• Robert Furber and Bart Jacobs, Towards a categorical account of conditional probability.


The North Pole Was, Briefly, a Lake

22 August, 2013


It happened over a month ago. The picture above was taken on 22 July 2013. It shows a buoy anchored near a remote webcam at the North Pole, surrounded by a lake of melted ice:

• Becky Oskin, North Pole now a lake, LiveScience, 23 July 2013.

Instead of snow and ice whirling on the wind, a foot-deep aquamarine lake now sloshes around a webcam stationed at the North Pole. The meltwater lake started forming July 13, following two weeks of warm weather in the high Arctic. In early July, temperatures were 2 to 5 degrees Fahrenheit (1 to 3 degrees Celsius) higher than average over much of the Arctic Ocean, according to the National Snow & Ice Data Center.

Meltwater ponds sprout more easily on young, thin ice, which now accounts for more than half of the Arctic’s sea ice. The ponds link up across the smooth surface of the ice, creating a network that traps heat from the sun. Thick and wrinkly multi-year ice, which has survived more than one freeze-thaw season, is less likely to sport a polka-dot network of ponds because of its rough, uneven surface.

This particular meltwater pond was “just over 2 feet deep and a few hundred feet wide”, according to this article:

• Hannah Hickey, Santa’s workshop not flooded—but lots of melting in the Arctic, 30 July 2013.

The pond drained out through cracks in the ice late on July 27.

More important is the overall trend in the total sea ice volume as estimated by the Pan-Arctic Ice Ocean Modeling and Assimilation System (PIOMAS).

The trend line from 1979 to 2011 shows that Arctic sea ice is melting at an average rate of roughly 3,000 cubic kilometers per decade.

In 2010, 2011 and 2012, so much ice melted that the volume fell more than 2 standard deviations below the trend line—that’s why the jagged curve falls below the shaded region at the far right of the graph. At the end of July this year, it was just about 2 standard deviations below the trend line. The ice volume seems unlikely to break last year’s record low.
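
If you’d like to reproduce this sort of trend-and-band analysis yourself, here’s a rough sketch, assuming you’ve put PIOMAS volume estimates into a CSV with made-up columns year and volume_km3; the files PIOMAS actually distributes are formatted differently, so adjust the loading step:

```python
import numpy as np
import pandas as pd

# Hypothetical CSV of annual-mean Arctic sea ice volume.
# The real PIOMAS data files use a different layout; adapt the loading step.
data = pd.read_csv("piomas_volume.csv")     # columns: year, volume_km3

# Fit a linear trend over 1979-2011, the period used in such figures.
base = data[(data.year >= 1979) & (data.year <= 2011)]
slope, intercept = np.polyfit(base.year, base.volume_km3, 1)
print("trend:", round(float(slope) * 10), "km^3 per decade")   # roughly -3000

# Residuals around the trend define the +/- 2 sigma band shown in the plot.
sigma = (base.volume_km3 - (slope * base.year + intercept)).std()
data["anomaly_sigmas"] = (data.volume_km3 - (slope * data.year + intercept)) / sigma
print(data[data.anomaly_sigmas < -2][["year", "anomaly_sigmas"]])  # the record-low years
```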

As usual, click the picture for more details.


To Really Judge Good Music, Turn Off the Sound

21 August, 2013

In a study published in the Proceedings of the National Academy of Sciences, Chia-Jung Tsay showed musicians clips from classical music competitions. She asked them to guess the winners. Different musicians were given different kinds of clips: audio recordings… videos with sound… and videos with no sound!

They did best when they saw videos with no sound.

It’s not that the winners were good-looking. It’s that they moved in expressive ways.

This reminds me of how people are willing to pay more for wines whose names are hard to pronounce… or how you can predict who will win an election by watching videos of the candidates—with the sound off.

I think there’s something important about these results. They’re a bit depressing: if our cognitive apparatus is so deeply flawed, maybe we’re doomed. But maybe it’s not so bad. Maybe we just tend to define the point of various activities too narrowly!

We go to a concert, not just to listen to music, but to watch the performer. We drink a wine, not just to taste it in our mouth, but to bask in the sense of being sophisticated. We judge a candidate, not just by what they say, but by how they say it.

You can see the paper here:

• Chia-Jung Tsay, Sight over sound in the judgment of music performance, Proceedings of the National Academy of Sciences, 19 August 2013.

You can hear (or see) a nice summary and discussion here:

• Shankar Vedantam, How to win that music competition? Send a video, National Public Radio, 20 August 2013.

