Friday, December 23, 2011

Art, Authenticity, and Rationality

Most people can't recognize the difference between a genuine and a fake painting, yet Oxford University scientists have discovered that people find more pleasure in viewing a work of art when they are told they are looking at an original painting rather than a fake. More from Science Daily:
Professor Martin Kemp, Emeritus Professor of the History of Art at Oxford University, said: "Our findings support what art historians, critics and the general public have long believed -- that it is always better to think we are seeing the genuine article. Our study shows that the way we view art is not rational, that even when we cannot distinguish between two works, the knowledge that one was painted by a renowned artist makes us respond to it very differently. The fact that people travel to galleries around the world to see an original painting suggests that this conclusion is reasonable."

When a participant was told that a work was genuine, it raised activity in the part of the brain that deals with rewarding events, such as tasting pleasant food or winning a gamble. Being told a work is not by the master triggered a complex set of responses in areas of the brain involved in planning new strategies. Participants reported that when viewing a supposed fake, they tried to work out why the experts regarded it not to be genuine.

Andrew Parker, Professor of Physiology at Oxford University and the study's senior author, said: "Our findings support the idea that when we make aesthetic judgements, we are subject to a variety of influences. Not all of these are immediately articulated. Indeed, some may be inaccessible to direct introspection but their presence might be revealed by brain imaging. It suggests that different regions of the brain interact together when a complex judgment is formed, rather than there being a single area of the brain that deals with aesthetic judgements."

The Accuracy of Political Experts

Wharton professor Philip Tetlock on political experts:
The finding that people found most surprising from my work on expert judgement, and in particular the work on expert political judgement, was that there was somewhat of an inverse relationship between how accurate experts were and how famous they were, that the media tended to be drawn to experts who offered very short and sharp soundbites. And those tended to be experts who, in my study, were somewhat less accurate.
From this short interview:


An important reminder as the American presidential race heats up!

What will historians say about us?


Big Think recently launched The Floating University and has started posting excerpts from its first online course, Great Big Ideas. The site recently featured an excerpt from Larry Summers' lecture, "The Authority of Ideas: Decoding the DNA of Education in Search of Actual Knowledge." In the excerpt, Summers addresses the question, What will historians say about our time 100 years from now? His response—that historians of tomorrow will be shocked by our treatment of the poor, just as we are shocked by slavery and other injustices of the past—reflects his central argument, as summarized by the editors of Big Think, that "we are moving from a world governed by the idea of authority to a world governed by the authority of ideas. He sees history as progressive on a macro-level: a series of cultural advances leading—slowly, and with many interruptions—toward increased human empathy and collective understanding." The short video excerpt of Summers' lecture is worth watching.

On Historical Distance

In an essay published by Tablet Magazine earlier this week, University of Houston history professor Robert Zaretsky presents a reflection on his professional work as a "Holocaust expert." The depth of his introspection—really, his willingness to reveal his innermost doubts about his role as a professional historian of the Holocaust—makes the essay a worthwhile read. The essay also raises some interesting questions about the authority of professional historians vis-à-vis eyewitnesses. Zaretsky writes:
Normally, being "there" is not an issue for a historian. Only a lunatic would repudiate an account of, say, the fall of the Bastille or Battle of Marathon because the historian had been born one or one hundred generations too late to savor the sulfur or participate in a phalanx. In fact, historians have long assumed that not being there is a professional advantage. In an odd phenomenological twist, we have always claimed that the distance provided by time and space, along with the accumulation of documents and data, permits us to know the past even better than did an event’s contemporaries, who were stuck in the chaos as they happened. Anyone can make history, but it takes a historian to understand it.
He then goes on to discuss why some contend that the benefits of historical distance may not apply to the Holocaust—and how he's wrestled with this issue as an American Jew and historian.

Wednesday, December 21, 2011

NYC Subway: On Waste and Wasted Money

Last night, I ventured into Manhattan to enjoy a meal with a friend in town for the holidays. While waiting for the subway, I noticed an emotionally intelligent sign on a trashcan:

Penn Station Uptown C waiting area, NYC
Playing on both the subway motif and the dictum "The Buck Stops Here," the sign is clever, convincing, and empowering—instilling a sense of pride in those who take their civic responsibility seriously. I've written before about emotionally intelligent signs and continue to enjoy the signs posted by Daniel Pink on his blog, but this is the first time I've spotted such a great example in the field.

On a related note, as I prepared to board the subway, I noticed that I had two old Metrocards in my wallet, and when I checked the balance on each, I discovered that both had just under fifty cents remaining. (A one-way subway ride in New York City costs $2.25, and riders use a declining-balance swipe card, a Metrocard, to pay the fare.) While I refilled one of the cards—and was thus able to use the balance on it—the other card seemed like a burden. I didn't want to carry it around in my wallet since I only ride the subway occasionally, yet I didn't want to throw it away. (And, unfortunately, the Metrocard vending machines don't allow riders to transfer a partial balance from one card to another.)

This evening, I came across a fantastic idea (via The Atlantic's Alexis Madrigal) for what someone like me could do with the remaining balance on a Metrocard. A group of student social entrepreneurs has developed MetroChange, a product that could allow subway riders to donate card balances to charity. Here's a video detailing their product:


As an aside, I wonder to what extent the MTA benefits from subway riders tossing away cards with small balances—just as retailers benefit from unredeemed gift cards. Might widespread use of a device like MetroChange result in a fare hike?

Sunday, December 18, 2011

Secret Life of Pronouns

Several weeks ago, students in one of my sections of Theory of Knowledge engaged in a spirited debate about the extent to which the words we use reflect our experiences, biases, upbringing, culture, and so forth. Some students posited that every word we utter is a reflection of ourselves—even answers to simple questions (e.g. What color are the walls in this room?)—while others argued that we can speak with neutrality, at least in some cases.

I was reminded of this debate today when I came across an interview with James W. Pennebaker, psychology professor and author of The Secret Life of Pronouns: What Our Words Say About Us, on the Harvard Business School website. The interview focuses (quite naturally) on the application of Pennebaker's research to business. Here's his take on what one can tell about a job applicant from a careful analysis of the words s/he uses during an interview:
It’s almost impossible to hear the differences naturally, which is why we use transcripts and computer analysis. Take a person who’s depressed. "I" might make up 6.5% of his words, versus 4% for a nondepressed person. That’s a huge difference statistically, but our ears can’t pick it up. But hypothetically, if I were to listen to an interview, I might consider how the candidate talks about their coworkers at their last job. Do they refer to them as "we" or "they"? That gives you a sense of their relationship to the group. And if you want someone who’s really decisive in a position, a person who says "It’s hot" rather than "I think it’s hot" may be a better fit.
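The kind of transcript analysis Pennebaker describes is easy to sketch. Here's a minimal illustration—my own toy example, not Pennebaker's actual software—that computes the share of first-person singular pronouns in a text:

```python
import re

# First-person singular pronouns of the sort Pennebaker's analyses track
FIRST_PERSON = {"i", "me", "my", "mine", "myself"}

def first_person_share(transcript: str) -> float:
    """Return the percentage of words that are first-person singular pronouns."""
    words = re.findall(r"[a-z']+", transcript.lower())
    if not words:
        return 0.0
    return 100.0 * sum(w in FIRST_PERSON for w in words) / len(words)

print(f"{first_person_share('I think I did well at my last job.'):.1f}%")  # 33.3%
```

Run over real interview transcripts, a gap like the 6.5% versus 4% that Pennebaker cites would show up plainly in the returned percentages, even though our ears miss it.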
I haven't read Pennebaker's book, but I will add it to my reading list. As an aside, the article also includes this cool graphic—which may or may not be a word cloud—of the twenty most commonly used words in English:

Thursday, December 15, 2011

"In Service"

This past summer I wrote an article for Outreach, Newark Academy's magazine for alumni and friends, about the Academy during the Second World War. My research gave me the opportunity to dig through the school's archives and to interview many alumni about their experiences during and after the war era. The article has now been published. I encourage you to check it out.

Tuesday, December 13, 2011

The Comedy of Hubris

Today, an email from a colleague included this one-liner attributed to George Carlin:
Have you ever noticed that, when you're driving, anyone driving slower than you is an "idiot" and anyone driving faster than you is a "maniac?"
It reminds me of the Dunning-Kruger effect.

Monday, December 12, 2011

Cross-sensory Perception

It's not surprising that scientists are discovering the great extent to which our senses act in concert to help us make sense of the world. This weekend's Boston Globe Ideas section included an article that reviewed some recent research into cross-sensory perception. Some findings of note:
His [Oxford University researcher Charles Spence] recent work on the psychology of flavor perception, for instance, has shown that the flavor of your food is influenced by touch, vision, and even sound. A study from his lab a few years ago showed that people rate potato chips as crisper and better-tasting when a louder crunch is played back over headphones as they eat. 
A study published this year showed that people thought a strawberry mousse tasted sweeter, more intense, and better when they ate it off a white plate rather than a black plate. 
In some cases they [senses] compete with each other and one wins out (as your eyes win over your ears in the movies). In others, the information merges into something new; when people watch a video of a person saying “ga” while the audio is dubbed with a voice saying “ba,” they hear an intermediate “da.” 
I tried this last one with a group of students this morning, and while our experiment didn't yield the same results as the professional researchers', we were all quite excited to consider the complexities of our senses and the applications of this research to a variety of fields and professions, including cooking, marketing, and art.

Sunday, December 11, 2011

Hollywood's Constructed Languages

The New York Times today featured an article detailing the rise of constructed languages in Hollywood's recent sci-fi movies. The article is worth reading in its entirety, and the clips of spoken Dothraki, a language invented for an HBO series, applied to life in New York City are pretty cool. Here's an excerpt from the article that connects recent developments in Hollywood to the history of constructed languages:
There have been many attempts to create languages, often for specific political effect. In the 1870s, a Polish doctor invented Esperanto, meant to be a simplified international language that would bring world peace. Suzette Haden Elgin created Láaden as a language better suited for expressing women’s points of view. (Láaden has a single word, “bala,” that means “I’m angry for a reason but nothing can be done about it.”)

But none of the hundreds of languages created for social reasons developed as ardent a following as those created for movies, television and books, says Arika Okrent, author of “In the Land of Invented Languages.”

“For years people have been trying to engineer better languages and haven’t succeeded as well as the current era of language for entertainment sake alone,” Ms. Okrent said.

Friday, November 11, 2011

The Power of Language and Emotion

While preparing for a class session on the power of language, I came across this video, which speaks directly to the connection between language and emotion as ways of knowing:



The video was made by The sky is the limit, which describes itself as "a global effort designed to influence, inspire and motivate people from all walks of life." They have a ton of fascinating material that I hope to explore at some point.

Monday, November 7, 2011

The Languages We Tweet

I recently discovered Big Think, a website which describes itself as a "forum where top experts explore the big ideas and core skills defining the 21st century." Along with Arts and Letters Daily, it's quickly becoming one of my go-to sites for intellectual engagement. It's also a fantastic source for ideas and articles relevant to Theory of Knowledge.

A post from today on Big Think's Strange Maps blog features dazzling geolinguistic maps that mark the languages people around the world tweeted over the course of several months in 2011. Here's the map of Europe. Each color represents a different language:


The blog's author, map enthusiast Frank Jacobs, notes that the data visualizations tell us a lot about technology, society, and language. Here are some of his observations:
Western Europe is lit up like a christmas tree - with the Netherlands glowing especially bright. Eastern Europe: not so much. Russia is a spider’s web of large cities connected through the darkness of the vast, empty countryside.
and
The fun really begins in Europe, where some countries just vanish off the map: Belgium tweets in Dutch and French, Switzerland mainly in German, with a French bit west of the Röstigraben. And other countries emerge out of nowhere: Catalans twitter in their own language, not Spanish. German dominates Central Europe, but a surprisingly large chunk of Austria appears to be tweeting in Italian - as do a lot of dots inside France.
The map of the eastern United States is equally fascinating:


Population centers and highways pop immediately, as do portions of Quebec. The images are a triumph of data mining and creative visualization. The much larger full world map on Flickr is worth checking out.
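Out of curiosity about the mechanics, here's a rough sketch of how one might render such a map from a dump of geotagged, language-tagged tweets. The file name and column names are my assumptions—the post doesn't describe the actual pipeline behind Jacobs' maps:

```python
import csv
from collections import defaultdict

import matplotlib.pyplot as plt

# Hypothetical input: one row per geotagged tweet, language already detected.
# Assumed columns: lon, lat, lang
points = defaultdict(list)
with open("tweets.csv", newline="") as f:
    for row in csv.DictReader(f):
        points[row["lang"]].append((float(row["lon"]), float(row["lat"])))

fig, ax = plt.subplots(figsize=(10, 8), facecolor="black")
ax.set_facecolor("black")  # dark background gives the "lit up" effect
for lang, coords in points.items():
    lons, lats = zip(*coords)
    ax.scatter(lons, lats, s=0.05, label=lang)  # one color per language
ax.set_xlabel("longitude")
ax.set_ylabel("latitude")
ax.legend(markerscale=100, fontsize="small")
fig.savefig("tweet_languages.png", dpi=300)
```

With millions of points, the tiny marker size is what lets population centers and highways emerge as bright filaments against the dark countryside.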

Sunday, November 6, 2011

The Amazing Octopus

A quick recommendation: Orion Magazine's current issue includes a story about the octopus's vast mental abilities, many of which are shocking and amusing at the same time. It's a short piece and well worth reading.

Tuesday, September 20, 2011

Science and the Law

The September 2011 edition of Nature includes a fascinating look at the background to a trial underway in Italy. Authorities there have indicted several scientists and politicians for failing to warn the public adequately about an earthquake in the ancient town of L'Aquila that took the lives of over 300 people in 2009. The case raises interesting questions about (1) how the media and government officials communicate the often nuanced positions of scientific authorities, (2) the ways in which statements by scientists are influenced by the public, and (3) the extent to which people are willing to trust authority.

Wednesday, June 29, 2011

New Words: 1927 and 2011

The Atlantic's Alexis Madrigal recently posted 15 new words from the 1927 Webster's International Dictionary. The additions include airplane, jazz, movie, and windshield. It's interesting to compare these to the words the Oxford English Dictionary added this year.

Monday, June 27, 2011

The danger of computers filtering Internet content

Online organizer Eli Pariser delivered a fascinating TED talk last month about how computer algorithms filter what we come across on the Internet. He calls for changes in corporate practice and policy in order to prevent what could be a grim future for online public discourse.

Friday, June 24, 2011

More on Free Will

The Boston Globe's Josh Rothman recently detailed new research into free will. Responding to the work of Benjamin Libet, who, in the 1970s, conducted a series of experiments that suggested that the experience of consciously making a decision follows brain processes that dictate action (i.e. that we lack free will), a new group of researchers have conducted more precise experiments using new neuroscience technologies. The new research suggests that making choices involves both our minds and our bodies, complicating Libet's theory and offering a more complex view of free will.

Friday, June 17, 2011

Free Will in Limited Supply?

Is self-control a limited resource? Do we deplete our supply as we make decisions throughout our day? A recent piece in The New Republic by Jamie Holmes explores social psychology research that suggests that willpower is, indeed, limited. Experiments have demonstrated that people who complete tasks which require them to exert self-control become less able to exert self-control on subsequent tasks.

The article focuses on poverty and asks why more poor people can't escape it. Holmes offers a brief introduction to the psychology of economic decision making. He then argues that poor people must routinely make weighty decisions that wealthier people would consider trivial—whether to buy food or medicine, for example. Making these decisions exacts a mental cost, depleting the willpower of the poor and making their escape from poverty even more challenging.

Wednesday, June 15, 2011

Summer... and the Science of User Interface Design

Summer has arrived! While I hope to continue to post at least three times a week, I plan to write shorter posts. These will include links to interesting articles that I find online—with less analysis. I hope that those of you who have come to read my blog on a regular basis—and there are a handful of you out there—will continue to find the blog an enjoyable stop on your journey across the web.

For now, please enjoy this article about user interface design from the Fast Code Design blog (via Brainiac). The author examines why software and operating system designers lay out user options in the way that they do. He asks, for example, why the 'OK' button in computer dialog boxes usually sits to the right of the 'Cancel' button. His analysis peers into the psychology of user interface design.

Monday, June 13, 2011

World War II: The importance of continued scholarship

Several weeks ago, the New York Times' Sunday Book Review published a review of several World War II history books. Written by Adam Kirsch, the review notes that a new crop of books, written by historians now more than a generation removed from the war, complicates our common understanding of World War II as, unequivocally, a "good war." These historians do not challenge historical fact; rather, they seek to expose and to evaluate the moral implications of a wide variety of actions—from Churchill's complicity in the Bengal famine to Allied area bombardment of German cities.

In addition to reviewing scores of books, Kirsch provides a thoughtful analysis of the importance of historical scholarship—especially scholarship that makes us see the past with fresh eyes. Many of the new books about World War II remind us that with war come moral challenges—yet we cannot turn a blind eye to injustice. Kirsch writes:
After all, the present is always lived in ambiguity. To those who fought World War II, it was plain enough that Allied bombs were killing huge numbers of German civilians, that Churchill was fighting to preserve imperialism as well as democracy, and that the bulk of the dying in Europe was being done by the Red Army at the service of Stalin. It is only in retrospect that we begin to simplify experience into myth—because we need stories to live by, because we want to honor our ancestors and our country instead of doubting them. In this way, a necessary but terrible war is simplified into a “good war,” and we start to feel shy or guilty at any reminder of the moral compromises and outright betrayals that are inseparable from every combat.

The best history writing reverses this process, restoring complexity to our sense of the past. Indeed, its most important lesson may be that the awareness of ambiguity must not lead to detachment and paralysis—or to pacifism and isolationism, as Nicholson Baker and Pat Buchanan would have it. On the contrary, the more we learn about the history of World War II, the stronger the case becomes that it was the irresolution and military weakness of the democracies that allowed Nazi Germany to provoke a world war, with all the ensuing horrors and moral compromises that these recent books expose.

Friday, June 10, 2011

The Uncertainty of Our Inner Lives

We rely on our intuitions, emotions, perceptions, and memories to make sense of our world, to get along with other people, to make decisions—simply, to survive. Yet it's nearly impossible for us to verify them and, claims Eric Schwitzgebel, philosopher and author of Perplexities of Consciousness, we actually have a poor grasp of them. He argues that we are “poorly equipped with the tools, categories, and skills that might help [us] dissect them." What's more, we rarely question our minds and are inclined to a sort of cognitive hubris—at least when it comes to how we feel, see, and think. "When you’re in a position where you are the sole authority on something," he says, "that tends to artificially inflate your confidence."

The Boston Globe recently published an interview with Schwitzgebel, in which he explains his contention that we should be skeptical about our inner lives—even more skeptical than we are about the world outside of the mind. Our knowledge of self, he says, is weak knowledge at best. The author of the interview, Joshua Rothman, prefaces the piece by recounting a famous experiment conducted by Solomon Asch:
In 1951, the psychologist Solomon Asch gathered seven college students around a table and presented them with two cards. On one, he’d printed a single vertical line; on the other, three lines of varying lengths. Going around the table, Asch asked each student a simple question: Which of the three lines was the same length as the solitary one? Asch’s secret was that all but one of the “volunteers” were actors, with instructions to answer incorrectly. While the actors contributed their wrong answers, Asch watched the real volunteer, who always went last. Would he give in to the pressure of the group?

The results were unsettling: When they had to go against the group, 75 percent of Asch’s volunteers gave at least one wrong answer, often without knowing it. Psychologists have long cited Asch’s experiments as sublime demonstrations of “groupthink.” But they also point to a more subtle and disquieting aspect of our inner lives: They suggest just how easily our confidence in our own perceptions, memories, and inner experiences can be shaken. Most of us assume that we know, with omniscient certainty, exactly what we’re thinking, feeling, and perceiving. Asch’s experiments force us to question that certainty. If we’re so sure of what’s going on in our own minds, then how can we be so easily persuaded to change them?
I understand why most people find the results of Asch's study unsettling—just as they do Schwitzgebel's suggestion that we can know little with certainty about what we think or feel. We find comfort in believing we know our own emotions, have memories about which we can be certain, and can perceive the world around us without interference from other people. When asked if he finds his ideas unnerving, Schwitzgebel said:
I may be unusual in a certain way, in that I find being cast into doubt and uncertainty kind of liberating and exhilarating and fun. When I read a piece of philosophy or piece of psychology or science fiction, and it throws me off and confuses me and bewilders me, and calls into doubt what I thought I knew—that lights my candle, that’s what I really like.
While Schwitzgebel enjoys the thrill of uncertainty—did he take Theory of Knowledge?—he contends that we can move towards certainty. "In coming to self-understanding," he says, "we can use introspection to some extent....We can also use third-person evidence." I find Schwitzgebel's ideas quite interesting. But then again, how certain can I be of that?

Wednesday, June 8, 2011

Metaphors and National Defense

In a post last week, I discussed a new book about metaphors—their ubiquity, complexity, and importance in thought and communication. Apparently, US intelligence personnel have become quite interested in metaphors as well. A colleague recently pointed me to an article by Alexis Madrigal, a senior editor at The Atlantic, which details The Metaphor Program, a project of the government's Intelligence Advanced Research Projects Activity.

According to the program's website, "The Metaphor Program will exploit the fact that metaphors are pervasive in everyday talk and reveal the underlying beliefs and worldviews of members of a culture." The program, through grants to computer scientists and linguists, seeks to develop computer algorithms that can identify conceptual metaphors used by those who may seek to harm the US. Such an algorithm could be deployed to the corners of the Internet and perform the work of an army of intelligence officers. To be effective, such an algorithm would need to be exceptionally knowledgeable. "What IARPA's project calls for is the deployment of spy resources against an entire language," Madrigal notes. "Where you or I might parse a sentence, this project wants to parse, say, all the pages in Farsi on the Internet looking for hidden levers into the consciousness of a people."

The complexity of human language and culture makes developing such an algorithm a great challenge. Metaphorical language can change—what would it have meant in 1980 to surf the information superhighway?—and is highly contextual. Madrigal explores this complexity:
While some of the underlying structures of the metaphors—the conceptual categories—are near universal (e.g. Happy Is Up), there are many variations in their range, elaboration, and emphasis. And, of course, not every category is universal. For example, Kövecses points to a special conceptual category in Japanese centered around the hara, or belly, "Anger Is (In The) Hara." In Zulu, one finds an important category, "Anger Is (Understood As Being) In the Heart," which would be rare in English. Alternatively, while many cultures conceive of anger as a hot fluid in a container, it's in English that we "blow off steam," a turn of phrase that wouldn't make sense in Zulu.
The Metaphor Program assumes, Madrigal notes, that the metaphors someone uses can tell us a great deal about that person's worldview—perhaps more than the person says in plain language. The applications for a machine that can detect and analyze metaphors, then, extend well beyond national security. The fruits of The Metaphor Program will help advance the cause of artificial intelligence in many domains.
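To make the computational challenge concrete, here's a deliberately naive sketch of conceptual-metaphor spotting—a toy of my own invention, nothing like the algorithms IARPA is actually funding, whose design the article doesn't detail. It flags a category such as Anger Is A Hot Fluid when words from the target domain and the source domain co-occur in a sentence:

```python
import re

# Toy lexicons: a conceptual category fires when a target-domain word and a
# source-domain word appear together in the same sentence.
CATEGORIES = {
    "HAPPY IS UP": ({"happy", "happiness", "mood", "spirits"},
                    {"up", "high", "rose", "lifted", "soaring"}),
    "ANGER IS A HOT FLUID": ({"anger", "angry", "temper", "rage"},
                             {"boiled", "boiling", "steam", "simmered", "erupted"}),
}

def spot_metaphors(sentence: str) -> list[str]:
    """Return the conceptual categories suggested by a sentence."""
    words = set(re.findall(r"[a-z]+", sentence.lower()))
    return [name for name, (target, source) in CATEGORIES.items()
            if words & target and words & source]

print(spot_metaphors("His anger boiled over and he blew off steam."))
# -> ['ANGER IS A HOT FLUID']
```

The gap between this toy and a useful system is exactly the cultural and contextual knowledge Madrigal describes: a lexicon like this would misfire across languages—recall the Zulu heart versus the Japanese hara—and would miss any metaphor it hasn't been told about.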

Monday, June 6, 2011

On Unintelligent Voting

I try to vote in every election I can—school board, village president, state legislature, governor, and, of course, federal legislature and President. I am not, however, always a well-informed voter. Take, for example, my vote during my village's school board election this past April. While I received a flyer in the mail from each candidate that listed his or her platform, all of the platforms included the same platitudes. Instead of investigating each candidate's positions further by, for example, attending a candidate forum, I voted for the candidate who attended the same undergraduate college that I did. My vote was not well informed, but at least I had voted—unlike 90% of my fellow village citizens. I had done my civic duty, right?

Perhaps not. In The Ethics of Voting, Brown University professor Jason Brennan argues that while voting may be a fundamental American right, voting and voting well are two different acts. Voting, like singing, can be done well or badly—and one has no obligation to do it at all. Josh Rothman, who reviewed Brennan's book in a recent post on the Boston Globe's Brainiac Blog, summarizes his main argument:
To vote well, Brennan argues, you actually need to be thinking at a very high level. It's not enough to know which policies different candidates support. You also need to have "epistemically justified" opinions about those policies—which, in many cases, means drawing on "social-scientific background knowledge." That knowledge is hard to acquire, which is why reasonable people can disagree about their votes while also voting well; the point is that they've done their due diligence and taken voting seriously.
Rothman reports that Brennan is not suggesting we abandon civic engagement. Rather, he is urging citizens to consider voting an optional responsibility—one with which the individual need not burden himself if he is not prepared to educate himself. Because we are not obligated to vote, Brennan argues, if we choose to vote we have an ethical responsibility to our fellow citizens to do it well. I will need to heed Brennan's advice the next time the polls open in my town.

Friday, June 3, 2011

Book Review: I is an Other

"Whenever we describe anything abstract—ideas, feelings, thoughts, emotions, concepts—we instinctively resort to metaphor," writes James Geary in I Is an Other: The Secret Life of Metaphor and How It Shapes the Way We See the World. I came across Geary's book in my local public library and found it an easy read. Since then, I have found myself dissecting and categorizing metaphors—just as Geary does throughout his book. "We utter about one metaphor for every ten to twenty-five words," he notes early in the book, so it's no wonder that they keep on popping up everywhere.

The word metaphor, Geary explains, derives from two Greek roots: meta (over, across, or beyond) and phor (to carry). Metaphors allow us to carry ideas from one realm to another, and they do so beautifully and efficiently. Geary spends the bulk of the book demonstrating the ubiquity and power of metaphors. Consider his analysis of the word shoulder:
You can give someone the cold shoulder or a shoulder to cry on. You can have a chip on your shoulder or be constantly looking over your shoulder. You can stand on the shoulders of giants, stand shoulder to shoulder with your friends, or stand head and shoulders above the rest. Wherever you turn, you can't help but rub shoulders with one of the word's multitude of metaphorical meanings.
Consider also his analysis of the ways in which we seek to understand finance through metaphor: 
Flick on the business news and you're in for a smorgasbord of financial metaphor. Gasp in horror as the bear market grips Wall Street with its hairy paws; then cheer as fearless investors claw back gains. Watch in amazement as the NASDAQ vaults to new heights; then cringe as it slips, stumbles, and drops like a stone. Wait anxiously to see if the market will shake off the jitters, slump into depression, or bounce back.
From literature to advertising, from comedy to tragedy, from science to art—metaphors make communication possible. In each of the book's fifteen chapters, Geary explores a different use of metaphor. His analyses pull from a range of disciplines—including linguistics, history, and psychology—and his writing is sharp. I highly recommend the book for anyone interested in taking a journey into the land of metaphor.

Wednesday, June 1, 2011

Craniopagus Twins

Last weekend, the New York Times magazine featured a story about a set of four-year-old conjoined twins, Krista and Tatiana Hogan, who, unlike most conjoined twins, are joined at the head. They are craniopagus twins. The author of the piece, Susan Dominus, provides both an anthropological portrait of the girls—she spent five days with the twins and their extended family—and an exploration of the meaning of identity.

What makes Krista and Tatiana's story so fascinating is that the girls appear to share sensory information. One sees something funny on television, for example, and the other laughs. Images of the girls' brains reveal an anatomical link between their thalamuses—a thalamic bridge. "The thalamus is a kind of switchboard," explains Dominus, "a two-lobed organ that filters most sensory input and has long been thought to be essential in the neural loops that create consciousness." The girls' doctors believe that sensory input from one of the girls can cross the thalamic bridge into the brain of her sister.

Dominus reports that she witnessed, over and over again, remarkable sensory connection between the girls. "Over the course of the days I spent with them," she writes, "I witnessed the girls do seemingly remarkable things: say the precise name of the toy that could only be seen through the eyes of her sister or point precisely, without looking, to the spot on her sister’s body where she was being touched." She also saw the girls fight when one of them wanted to eat chicken fingers and the other objected violently because she didn't like the taste.

In short, Krista and Tatiana seem, on some level, to share a mind. This makes them fascinating subjects for neuroscientists. Dominus explains:
The average person tends to fall back on the Enlightenment notion of the self—one mind, with privacy of thought and sensory experience—as a key characteristic of identity. That very impermeability is part of what makes the concept of the mind so challenging to researchers studying how it works, the neuroscientist and philosopher Antonio Damasio says in his book, "Self Comes to Mind." "The fact that no one sees the minds of others, conscious or not, is especially mysterious," he writes. We may be capable of guessing what others think, "but we cannot observe their minds, and only we ourselves can observe ours, from the inside, and through a rather narrow window."
Assuming Krista and Tatiana survive into adolescence and beyond—and I certainly hope they do—neuroscientists will seek to gauge if the girls are able to share abstract thoughts, just as psychologists will be eager to understand how their condition has shaped their identities. For the time being, despite physiological challenges, Krista and Tatiana, Dominus reports, are remarkably well-adjusted children—daily reminding us of the power of the brain.

In addition to reading the full article, I recommend this video about the twins:

Monday, May 30, 2011

Are There Natural Human Rights?

In a post on the Opinionator blog yesterday, philosophy professor Michael Boylan asks, Are There Natural Human Rights? He rightly claims that this question is more than academic, as much international policy is built upon the position that there are universal, natural human rights:
International policy would cease to be able to advocate universally for certain fundamental rights—such as those set out in the United Nations’ Declaration of Human Rights or the United States’ Bill of Rights and Declaration of Independence or Liu Xiaobo’s "Charter08." And of course, the idea that NATO, France, the United States or any other country should intervene in Libya would have never arisen. Instead, each nation would be free to treat its citizens as it chooses, subject only to the rule of power.  Hitler would not have been wrong in carrying out the Holocaust, but only weak because he lost the war. The logical result of such a position is a radical moral relativism vis-à-vis various cultural anthropologies.
Boylan reviews the history of conflicting responses to this question. Some people argue that the entire notion of natural human rights was created during the European Enlightenment, that what we consider natural human rights are merely social constructs. Others counter by arguing (1) that thinkers before the Enlightenment did discuss human rights even if they didn't use the terms we use today or (2) that there is a logical basis for proving the existence of natural human rights—a basis that transcends the history of writings about rights. Boylan goes on to offer a scholarly examination of these positions that draws on philosophy and history.

At the end of his post, Boylan suggests that our position on this question colors how we understand world events—such as the Arab Spring uprisings. He offers a thought experiment for his readers to help them determine their position on the question:
I have a thought experiment that might help the reader decide what he or she thinks is the correct position: imagine living in a society in which the majority hurts some minority group (here called “the other”). The reason for this oppression is that “the other” are thought to be bothersome and irritating or that they can be used for social profit. Are you fine with that?  Now imagine that you are the bothersome irritant and the society wants to squash you for speaking your mind in trying to improve the community. Are you fine with that? These are really the same case. Write down your reasons. If your reasons are situational and rooted in a particular cultural context (such as adhering to socially accepted conventions, like female foot binding or denying women the right to drive), then you may cast your vote with Hart, Austin and Confucius. In this case there are no natural human rights. If your reasons refer to higher principles (such as the Golden Rule), then you cast your vote with the universalists: natural human rights exist.  This is an important exercise. Perform this exercise with everyone you are close to—today—and tell me what you think.

Friday, May 27, 2011

Language and Gender: Kate Swift

As I mentioned in a post last month, the Sapir-Whorf theory holds that the words we know and use impact how we conceptualize our world and, perhaps, act in it. I was reminded of this after reading the obituary of Kate Swift earlier this month. Swift, a feminist wordsmith and writer, spent much of her professional life, along with her long-deceased partner Casey Miller, shedding light on gender assumptions in language.

Swift and Miller were awakened to gender bias in language when they were asked to copy-edit a sex education textbook in the 1970s. Describing their awakening, Swift and Miller noted in the introduction to one of their later books, "everything we read, heard on the radio and television, or worked on professionally confirmed our new awareness that the way English is used to make the simplest points can either acknowledge women’s full humanity or relegate the female half of the species to secondary status." They found, for example, that many authors assumed all police officers are men or referred to women by the color of their hair.

The obituary notes that Swift and Miller had a profound yet limited impact:
Some of the authors’ proposals gained traction. Many newspapers, textbooks and public speakers avoid “fireman” and “stewardess” nowadays. Other ideas fell by the wayside, notably “genkind” as a replacement for “mankind,” or “tey,” “ter” and “tem” as sex-neutral substitutes for “he/she,” “his/her” and “him/her.”
Swift and Miller's work demonstrates the ability we have—as individuals and as societies—to reflect upon and change the words we use.

Wednesday, May 25, 2011

To Lie is Human?

Is lying always wrong? If not, when is it right? Julian Baggini considers these questions—and many others—in his review of Ian Leslie's new book Born Liars: Why We Can’t Live Without Deceit, not yet available, it seems, in the United States.

Baggini begins his exploration by examining the simple maxim "always tell the truth," which we are told as children to obey. We soon discover, however, that telling the truth can be dangerous and that telling the "whole truth" is almost always impossible.  Baggini explains how defining "the truth" is a complex epistemological problem:
The problem with telling “the truth” starts with the definite article, because there is always more than one way to give a true account or description. If you and I were to each describe the view of Lake Buttermere, for example, our accounts might be different but both contain nothing but true statements. You might coldly describe the topography and list the vegetation while I might paint more of a verbal picture. That is not to say there is more than one truth in some hand-washing, relativistic sense. If you were to start talking about the cluster of high-rise apartment blocks on the southern shore, you wouldn’t be describing “what’s true for you,” you’d be lying or hallucinating.

So while it is not possible to give “the truth” about Lake Buttermere, it is possible to offer any number of accounts that only contain true statements. To do that, however, is not enough to achieve what people want from truth. It is rather a prescription for what we might call “estate agent truth.” The art of describing a home for sale or let is only to say true things, while leaving out the crucial additional information that would put the truth in its ugly context. In other words, no “false statement made with the intention to deceive”—St Augustine’s still unbeatable definition of a lie—but plenty of economy with the truth.
Baggini goes on to categorize two ways of thinking about the truth: moral and legalistic. To speak truthfully in a legalistic sense is to say nothing that is a (known) lie. When Bill Clinton said, "I did not have sexual relations with that woman," he was, in a legalistic sense, telling the truth. He felt no burden to reveal anything more. On the other hand, to speak truthfully in a moral sense requires us to speak honestly, sincerely, and accurately. Moral truthfulness, Baggini writes, borrowing the ideas of the philosopher Bernard Williams, "requires more than just true things being said, while acknowledging that there really is no such thing as 'the whole truth' anyway."

Baggini then goes on to explore the question: Does the truth always trump other virtues? He considers the merits of several arguments against the contention that truth telling—in the moral sense—is the greatest virtue:

Feelings. We may choose not to tell people how we feel about something in order to spare their feelings. "Little white lies" allow us to get along well with others, right? Baggini argues that we must tread carefully when adopting this rationale. He writes, "There is a risk of second guessing what is best for people or what we think they are able to deal with. Normally, it is better to allow people to make up their own minds on the basis of facts. Withholding truth for someone’s own benefit is sometimes justified but often it simply diminishes their autonomy. This is what Kant got right when he claimed that lying violates the dignity of man."

Personal Dignity and Privacy. We may choose to withhold information in order to protect our dignity. Arguably, this may have been what Clinton sought when he delivered his aforementioned statement about his relationship with Monica Lewinsky. On this account, Baggini asks, "If what you did is nobody else’s business, aren’t you entitled to lie to preserve your privacy?"

Hysteria. We often wish that our political leaders would speak simply and tell us the truth. We often feel that politicians are lying to us because, in so many cases, we find out that they are. Baggini suggests, however, that we may not appreciate the value of politicians limiting their expression. "Would it really be wise for a prime minister to announce, when a crisis breaks, that no one really knows what’s going on yet or has a clue what to do next?" he asks. "Leadership in a crisis may require projecting more calm and control than one really has behind closed doors. More honesty in politics would certainly be a good thing; complete honesty most probably disastrous."

Greater Good. We need not say what we truly believe so long as we are working toward the establishment of truth in the long run. Sociologist Steve Fuller, Baggini notes, contends that intellectuals must, at times, conceal their deepest convictions in order to further debate. Fuller, who has argued that scientists should not reject intelligent design theory out of hand, plays the devil's advocate. He withholds the truth in order to require his colleagues to think critically about evolution.

Baggini does not dismiss these four arguments against the truth. But, he argues, telling the truth does matter. "You could concoct a hypothetical situation in which we had to choose between lying or creating misery for all humankind," he writes, "but until and unless we ever come against such scenarios, most of us value truth, even to the detriment of some happiness. That is why we should develop the habit of telling truth, and distaste for lies. Truth should be the default; lying an exception that requires a special justification."

Still, Baggini concludes the article by arguing that "lying is deeply connected to what makes us human," an argument Leslie makes in Born Liars. Baggini writes:
We may not be the only creatures who have a “theory of mind”— the ability to see the world from the point of view of others—but we are certainly the species in which that capacity is most developed. It is precisely because of this that the possibility of lying emerges. We can lie only because we understand that others can be made to see the world other than as we know it to be.
So perhaps, as the title of Leslie's book suggests, we are born liars—at least in some sense of the word.

Monday, May 23, 2011

You know what tag questions are, right?

Earlier this month, the Boston Globe's Erin McKean, founder of Wordnik.com, presented a fascinating examination of tag questions, "those little questioning upticks, usually found at the end of a sentence." As speakers, we use tag questions unconsciously; as listeners, we rarely realize when someone is asking us one. Nonetheless, tag questions, small waves in the vast ocean of spoken language, play an important part in communication.

McKean notes that linguists have identified two kinds of tag questions: modal and affective. Modal tag questions seek information or confirmation: We're going to the movies on Tuesday, right? He should really change his hairdo, shouldn't he? Affective tag questions seek to soften the meaning of a statement or convey an emotional connection to an audience: This is how you change a light bulb; simple, right? That was a horrible movie, no? Linguists have not only categorized tag questions but also studied their use. McKean presents the findings from some interesting investigations: 

Culture. Modal tag questions tend to be the same across regions and cultures, while affective tags vary across regions and cultures. Consider that in the South you're likely to hear "you hear?" while in Canada you're likely to hear "eh?" at the end of the same sentence. 

Gender. Researchers used to identify tag questions with femininity, but they have since discovered that men use tag questions as frequently as—or in some cases, more frequently than—women. 

Power. "Powerful" speakers, "people who are in charge of making sure conversations go well," like teachers and doctors, tend to use tag questions more than other types of speakers.

McKean notes that tag questions are so ubiquitous because they are efficient. They help us connect quickly and avoid misunderstandings, and they "grease the conversational wheels." Interesting, right?

Friday, May 20, 2011

Memory: Collins' Forgetfulness

I recently came across Forgetfulness, a beautiful poem by Billy Collins, the Poet Laureate of the United States from 2001 to 2003. In the short piece, Collins addresses the frailty of memory with evocative imagery. 
Forgetfulness 
by Billy Collins 

The name of the author is the first to go
followed obediently by the title, the plot,
the heartbreaking conclusion, the entire novel
which suddenly becomes one you have never read,
never even heard of,

as if, one by one, the memories you used to harbor
decided to retire to the southern hemisphere of the brain,
to a little fishing village where there are no phones.

Long ago you kissed the names of the nine Muses goodbye
and watched the quadratic equation pack its bag,
and even now as you memorize the order of the planets,

something else is slipping away, a state flower perhaps,
the address of an uncle, the capital of Paraguay.

Whatever it is you are struggling to remember
it is not poised on the tip of your tongue,
not even lurking in some obscure corner of your spleen.

It has floated away down a dark mythological river
whose name begins with an L as far as you can recall,
well on your own way to oblivion where you will join those
who have even forgotten how to swim and how to ride a bicycle.

No wonder you rise in the middle of the night
to look up the date of a famous battle in a book on war.
No wonder the moon in the window seems to have drifted
out of a love poem that you used to know by heart.
The advertising agency JWT produced an animation to accompany a reading of the poem:

Wednesday, May 18, 2011

A College Degree: 55 Years Deferred

When it comes to history, many students seem to embrace the solipsistic position that any event that occurred before their year of birth is unimportant at best, dubious at worst. As a history teacher, I aim to help my students understand how deeply the events of the past influence contemporary society and our place in it. As a result, whenever I come across a human interest story that sheds light on the influence of the past on people today, I must share it with my students.

This past week, the New York Times featured such a story: In 1958, Burlyce Sherrell Logan left the University of North Texas after facing intensely racist bullying. She worked and raised a family and finally, in 2006, returned to the University. This past weekend, she earned her college degree, and her grandchildren celebrated with her.

While Logan's story is simple, her determination reflects the best of the human spirit, just as her delayed graduation demonstrates the ways in which the past can touch the present and the present can transcend the past. The article serves as a reminder that history is alive—something Logan knows well. At the end of the piece, she quips, “In September, I’m going to start on my master’s in history."

Monday, May 16, 2011

Twitter in the Classroom

Several months ago, a student in one of my TOK classes investigated the ways in which Twitter has changed knowledge—its generation, its dissemination, and our relationship to it. Given the role that social media has played in recent world affairs—and will continue to play—I delighted in having a student pick this topic.

During her presentation, the student asked the audience, other students, to take out their phones. Normally, phones must remain off during the school day, but I allowed the students to engage with them given the topic at hand. The student presenter then used a series of tweets to outline her presentation. Only minutes into the presentation, many other students tweeted responses to her ideas in real time. The students had begun conversing about the questions raised in the presentation via Twitter long before the presenter opened the floor to discussion.

This presentation offered me the opportunity to witness students communicating via a backchannel, an electronic conversation taking place alongside—but outside of—a real-world one. Last week, the New York Times featured a short article—Speaking Up in Class, Silently, Using Social Media—that detailed how some teachers, including some elementary school teachers, are engaging with students and gauging student interest via backchannels.

The author of the Times' article noted that "real-time digital streams allow students to comment, pose questions (answered either by one another or the teacher) and shed inhibitions about voicing opinions. Perhaps most importantly, if they are texting on-task, they are less likely to be texting about something else." I like the idea of teachers allowing students to participate in discussions electronically, especially if doing so helps students engage more meaningfully with course material. At the same time, much would be lost if electronic participation were to replace face-to-face conversation. Perhaps next year, I'll explore using social media in the classroom to create backchannel conversations, so long as these exchanges add value to the educational experience.

Friday, May 13, 2011

Emotion and the Perception of Time

We have all experienced moments in life when time seems to come to a standstill—a boring lecture, for example—and when time flies. What makes some minutes feel longer than others? A recent piece in Slate's The Explainer tackles this question by providing insights into the relationship between emotion and perception, two ways of knowing that can intersect in fascinating ways.

Using a quote about time from President Obama's recent interview with 60 Minutes—the President described watching the raid on Osama Bin Laden's compound as "the longest 40 minutes of my life"—the author of the piece, Jeremy Singer-Vine, summarizes the findings from several psychological studies. These studies suggest, not surprisingly, (1) that time seems to slow down when we experience events that elicit negative emotions, cause confusion, or are novel and (2) that time seems to speed up when we experience joyful or fun events. Psychologist Mihaly Csikszentmihalyi's idea of flow, explained nicely by Csikszentmihalyi in a 2008 TED talk, corroborates these findings and reminds us that joyful events can also be challenging ones. 

Singer-Vine then goes on to explain that psychologists have examined both how people feel about events in the moment and how they recall feeling about them after the fact. These studies into "temporal cognition," into the contrast between prospective and retrospective cognitive states, suggest that time feels slower in both prospective and retrospective terms when you're anxious, nervous, or waiting for something to happen. This helps explain why Obama both felt that time slowed down while watching the raid and remembers time having slowed down, too.

Wednesday, May 11, 2011

Cognitive Science and Railroad Tracks

After writing last week about Dan Pink's chronicles of emotionally intelligent signage, I enjoyed reading a story in this past weekend's Boston Globe about an experiment that Indian authorities have undertaken in an attempt to reduce railroad crossing deaths in and around Mumbai. The authorities, alarmed that an average of 10 people are killed by trains each day while crossing tracks in and near Mumbai, sought the help of Final Mile, a “behavior architecture” firm, "which uses the lessons of cognitive psychology to influence people on the brink of making decisions." By taking psychology into account in developing methods to dissuade people from crossing railroad tracks illegally—by recognizing that people usually ignore traditional "caution" signs—Final Mile has designed and implemented a number of interventions that have proven dramatically successful.

The interventions are as subtle as they are ingenious. The author of the article describes them well:
First, Final Mile painted alternate sets of railway ties in fluorescent yellow — five painted, five unpainted, and so on — to tackle what is known as the Leibowitz Hypothesis. As laid out in a 1985 issue of American Scientist by experimental psychologist Herschel W. Leibowitz, the hypothesis found that we frequently underestimate the speed at which large objects move. Leibowitz, who died earlier this year, first observed this with aircraft, and in 2003, a pair of scientists proved the hypothesis for trains. “The misperception happens because the brain has no frame of reference, no way to evaluate roughly how fast a train is moving,” said Satish Krishnamurthy, a Final Mile behavior architect. But with the new paint job, Krishnamurthy said, “the mind now has a way to gauge the train’s speed, by observing how fast it traverses these ties.”

Second, the consultants replaced the stick-figure signboards with a graphic three-part tableau, featuring in extreme close-up the horror-struck face of a man being plowed down by a locomotive. “We hired an actor,” Krishnamurthy said, smiling, “because it had to be realistic.” They were drawing on the research of Joseph LeDoux, a New York University professor of neuroscience and psychology. LeDoux studies the links between emotion and memory, and in particular the mechanism of fear. “Emotional memory is stored in the nonconscious part of your brain,” Dominic said. “If you’ve been in a car crash and, months later, you hear tires squealing, your heart rate goes up and you start to sweat. That’s because your emotional memory has been stirred up.” The new signs dispense with explanatory text and instead attempt to trigger an emotional memory of fear.

Final Mile’s third intervention required train drivers to switch from one long warning whistle to two short, sharp blasts. By way of explanation, Dominic cited a 2007 paper from the Stanford University School of Medicine, which found that brain activity — and hence alertness — peaks during short silences between two musical notes. “The silence sets up a kind of expectation in the brain,” said Vinod Menon, the paper’s senior author and a behavioral scientist working with the Stanford Cognitive and Systems Neuroscience Lab. “That’s the way it works in music, and it isn’t inconceivable that it would work similarly with train whistles.”
I'm excited not only by the work that Final Mile is doing but also by the willingness of public officials to experiment in the development of public policy. The article notes that government agencies from around the world are starting to take psychology into account when creating mechanisms to communicate with—and sometimes warn—the public.
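
The tie-painting intervention invites a quick back-of-the-envelope check. Below is a minimal sketch, in Python, of the arithmetic the painted bands make available to the eye; the tie spacing, group size, and timing are my own illustrative assumptions, not figures from Final Mile or the Globe article.

```python
# Back-of-the-envelope sketch of how painted tie bands give an observer
# a yardstick for judging train speed. All numbers below are assumed
# for illustration, not taken from Final Mile's actual design.

TIE_SPACING_M = 0.6   # assumed center-to-center distance between ties
GROUP_SIZE = 5        # ties per painted (or unpainted) band
BAND_LENGTH_M = TIE_SPACING_M * GROUP_SIZE  # 3.0 meters per band

def implied_speed_kmh(seconds_per_band: float) -> float:
    """Speed implied by the time the train's nose takes to cross one band."""
    meters_per_second = BAND_LENGTH_M / seconds_per_band
    return meters_per_second * 3.6

# If the front of a train sweeps across one five-tie band in 0.15 seconds,
# the implied speed is about 72 km/h -- far faster than the Leibowitz
# Hypothesis suggests a pedestrian would guess from the train alone.
print(f"{implied_speed_kmh(0.15):.0f} km/h")
```

No pedestrian performs this calculation consciously, of course; the point is that the painted bands supply the frame of reference that, per Leibowitz, the visual system otherwise lacks when judging a large moving object.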

Monday, May 9, 2011

Historical Fact: Did Bin Laden Have a Gun?

The story of how elite United States military forces carried out the assassination of Osama Bin Laden has captured the attention of people the world over. I've been fascinated by the accounts and the commentary. Historians may have difficulty, however, piecing together some of the details, as numerous reports from government officials have contained contradictory information about the weapon(s) which Bin Laden may (or may not) have been holding (firing?) when US troops killed him. In a recent post on his blog, David Weigel chronicles some of these contradictory accounts.

As they attempt to write about the assassination, professional historians will have to grapple with the contradictions. They may settle on an agreement of historical fact, consensus gentium, or they may use the conflicting accounts to suggest something about the fog of war, the failure of the White House to coordinate its message, or the deficiencies of memory. In an attempt to gain clarity, they will likely seek out interviews with the men who stormed Bin Laden's compound.

Students who investigate the event online may also encounter conflicting accounts, which will likely persist even as new and perhaps more accurate accounts emerge. Sadly, some students may stop searching after finding only one account because they assume that the one they have found is historical fact. Writing an account of the past is challenging, and so is helping students develop a healthy skepticism about the sources they rely upon to understand it.

Friday, May 6, 2011

Teaching Idea: Emotionally Intelligent Signs

On his blog, Daniel Pink routinely posts photos of emotionally intelligent signage—signs that not only direct but also take people's emotions into account in order to be maximally effective. His readers have sent him images from around the world—for example, no smoking notices in an Austin, TX, hotel and, earlier this week, Dutch road signs aimed at curbing road rage—that reflect innovative and creative thinking.

In schools, we (adults) need to remind students regularly to take (or not take) certain actions, yet students often become immune to our messages. Be quiet in the hallway. Don't leave your book bag on the floor. Shut off the lights when you leave the room. Recycle your scrap paper. Rarely do we communicate written admonitions with emotionally intelligent signage. And here's where I imagine TOK students could contribute to their school communities.

As a way to explore emotion as a way of knowing, I could imagine offering my TOK students the opportunity (in contest form, perhaps) to design emotionally intelligent signage for the school community. For example, students might be tasked with developing a sign to encourage their peers to be quiet during examination periods. A traditional sign might read, "Please be quiet. Testing in progress." But an emotionally intelligent sign might read, "Quiet Please! You'll be taking an exam sooner or later, too. Please respect those who are taking one now." I expect a group of TOK students would come up with some interesting—and effective—signs.

Wednesday, May 4, 2011

Book Review: Relating to Adolescents

Several months ago, in an electronic newsletter published by the Klingenstein Center, I read about Relating to Adolescents: Educators in a Teenage World by Susan Eva Porter. The book sounded like a worthwhile read, so I ordered a copy from Amazon. I'm glad I did.

Dr. Porter, an experienced teacher and clinician, makes a strong case that adults in schools need to act with intentionality when dealing with adolescents—a claim that may sound obvious but which bears repeating at regular intervals. In the introduction, she reminds the reader that "teenagers need us to practice certain skills that allow us to keep our roles clear, to maintain the boundaries between us, and to keep our cool when the energy of adolescence swirls around us." To illustrate her points about adult behavior, she provides myriad case studies, which enable the reader to grasp the importance of her claims with ease.

I so enjoyed Porter's practical approach that I selected an excerpt from her text as the basis for a session of a Professional Learning Community that I led earlier this school year. In the selection, from a chapter entitled "The Eightfold Path of Adult Self-Care," Porter takes the practices of the traditional Buddhist Eightfold Path and appropriates them for the contemporary educator. For example, in explaining the importance of "Right Intention," Porter notes that "teenage thoughts and emotions are all over the map, and they spread like wildfire.... To take care of ourselves, then, we should understand not only how to track our own thoughts and moods but also how to separate our internal processes from those of our students." She goes on to provide case studies and questions for the reader's consideration.

The participants in the PLC and I greatly enjoyed reading and discussing the selection from Porter's book. It sparked a tremendous amount of self-reflection while affirming some common practices as well. Porter writes to the educator in a practical, clear voice, and her book makes for an excellent read.

Monday, May 2, 2011

NBC's Parks and Recreation asks: What is art?

I recently received an email from one of my former students who wrote, "TOK seems to pop up everywhere now." I was not surprised by her revelation. Once you study epistemology and become attuned to looking for knowledge issues, you realize the great extent to which questions of knowledge underpin most of what you read in journals, magazines, and news reports.

On occasion, knowledge issues also creep into popular culture. This week's episode of the NBC sitcom Parks and Recreation, entitled "Jerry's Painting," revolves around the appropriateness of a painting for display in a city government building.


The episode raises several questions about art and art appreciation:
  • What is the proper function of art?
  • How do we know what is good art?
  • How do we evaluate the acceptability of art?
  • To what extent should art reflect the values of the institutions that make it publicly available?
The episode provides few answers to these questions—one imagines the writers never sought to do much more than get the audience to laugh—but it does suggest the complex relationship that people and institutions have with art.

Saturday, April 30, 2011

On Language: Wiping words from your dictionary

Does language shape the way we understand the world? This question, long debated by psycholinguists and TOK students worldwide, is the subject of a column by Erin McKean entitled "The power of not knowing: What’s missing from your dictionary?" which appears in this weekend's Boston Globe.

McKean offers a history of people and groups—everyone from Chairman Mao, to Pope Benedict XVI, to the Philadelphia Flyers—who have proclaimed certain words banished from their vocabularies. McKean observes that such claims of intentional ignorance are quite common (especially, she notes, among men) and reflect the idea that the words we know and use impact how we conceptualize our world and, perhaps, act in it. Known as the Sapir–Whorf hypothesis, this idea acknowledges the power of words and explains why, for example, a politician may proclaim (to a gathering of donors, of course) that he has wiped the word "defeat" from his vocabulary. McKean writes:
The idea of the 'word' surely has a certain power. Other kinds of information and other reference books don’t get quite the same treatment: You can speak of a place or thing as 'not [even] being on the map,' but there doesn’t seem to be a metaphorical equivalent for encyclopedias or almanacs, and in the age of cellphones, the idea of a metaphorically unlisted (or ex-directory) number is almost quaint. Most of the other 'lack' metaphors—being a few bricks short of a load or sandwiches short of a picnic, or not playing with a full deck—seem to indicate mental weakness, rather than strength.

And strength is what the 'not in the dictionary' metaphor is all about: These words that stand for difficult things (defeat, surrender, and failure, as well as other negative words such as can’t and no) may affect other people, but the stalwart individual can overcome them just by pretending the words don’t exist—that they are just strings of meaningless characters. Critical uses of this trope are much rarer: We’re far less likely to say that 'win isn’t in his dictionary' or 'he doesn’t know the meaning of the word victory.'
Most people think, Knowledge is Power. McKean asks us to reconsider this maxim. She notes, "Sometimes deliberate not-knowing, perhaps, is the way to get things done."

Thursday, April 28, 2011

Sugar and Cellphones: On Scientific Challenges

Many people consider the knowledge claims generated by natural scientists to be, for the most part, quite sound. Leucippus may have speculated that atoms compose all matter, but only through the work of a slew of modern scientists—Lavoisier, Dalton, Thomson, and many others—can most of us say, with almost perfect certainty, that we know atoms compose all matter. Of course, the frontiers of science continue to challenge our assumptions about the natural world and our ability to use the scientific method to understand it—but, for the most part, people still trust the scientific method and the knowledge it has generated.

Against this backdrop of certainty, the New York Times Magazine published two feature articles earlier this month which, despite their apparently disparate topics, offered fascinating commentary on the complexities—practical, statistical, and political—of the scientific method. The first, Do Cellphones Cause Brain Cancer?, discusses, not surprisingly, the role that cell phones may play in causing brain cancer. On its face, the article demands attention, as the ideas the author presents should be of interest to any cell phone user worldwide. But I found the article's treatment of the scientific method just as intriguing.

The author of the article, Dr. Siddhartha Mukherjee, details three ways that cancer epidemiologists can determine the cause of a particular cancer. "The first," he writes, "is arguably the simplest. When a rare form of cancer is associated with a rare exposure, the link between the risk and the cancer stands out starkly." As an example, he notes that doctors in London discovered, in the late Eighteenth Century, a remarkably high incidence of scrotal cancer among chimney sweeps. He explains that the second way is more challenging: "The opposite phenomenon occurs when a common exposure is associated with a common form of cancer: the association, rather than popping out, disappears into the background, like white noise." This common exposure–common cancer problem baffled Twentieth Century epidemiologists, who struggled for decades to prove a statistical link between smoking and lung cancer.

The third way that epidemiologists can determine the cause of cancer presents still more difficulty. "A common exposure... associated with a rare form of cancer," Mukherjee writes, "is cancer epidemiology’s toughest conundrum." He goes on:
The rarity of the cancer provokes a desperate and often corrosive search for a cause... And when patients with brain tumors happen to share a common exposure—in this case, cellphones—the line between cause and coincidence begins to blur. The association does not stand out nor does it disappear into statistical white noise. Instead, it remains suspended, like some sort of peculiar optical illusion that is blurry to some and all too clear to others.
Mukherjee devotes the remainder of the article to discussing the challenges scientists face in determining what is (and isn't) a carcinogen. He doesn't provide the reader with a simple response to the question he asks in the title, but he does explain why the scientific method, long considered tried-and-true, may not provide a pathway for answering the question anytime soon. Appropriately, he ends the piece thus: "Understanding the rigor, labor, evidence and time required to identify a real carcinogen is the first step to understanding what does and does not cause cancer."

The second article, Is Sugar Toxic?, discusses the growing body of nutrition science that suggests that sugar—the sugar we find in small white packets, the high-fructose corn syrup we find in many packaged foods, and the sugar we find in fruit—is toxic to our bodies. Written by Gary Taubes, the author of Why We Get Fat: And What to Do About It, the article focuses on the work of Robert Lustig, a pediatrician made famous by a lecture he delivered in 2009, the video of which has been viewed more than one million times. I encountered a link to the video on a friend's Facebook wall about a week before I read the Times' article. 

Lustig has dedicated his career to demonstrating that sugar is toxic, that it is the leading cause of obesity and cancer. He makes a convincing argument, and while some scientists have bought it, others have not. Not surprisingly, the Sugar Association and the Corn Refiners Association have weighed in, declaring sugar safe and using evidence from the Food and Drug Administration to bolster their claim.

Although the question of sugar's toxicity is important, Taubes' discussion of the challenge that science has faced in attempting to determine the safety of sugar is just as compelling. Taubes details a powerful argument from Walter Glinsmann, a former F.D.A. administrator and current adviser to the Corn Refiners Association. Glinsmann contends that "sugar and high-fructose corn syrup might be toxic, as Lustig argues, but so might any substance if it’s consumed in ways or in quantities that are unnatural for humans." Taubes rightly asks, "at what dose does a substance go from being harmless to harmful? How much do we have to consume before this happens?"

Lustig counters this argument by noting that sugar is not an "acute toxin" like others the F.D.A. regulates, whose effects "can be studied over the course of days or months." Sugar and high-fructose corn syrup, he claims, are “chronic toxins." Lustig explains that they are “not toxic after one meal, but after 1,000 meals.” This, according to Taubes, is the great challenge facing Lustig and others who wish to prove that sugar is the cause of many common ailments.

In a way, the challenge faced by Lustig is similar to the challenge faced by cancer epidemiologists when they sought to establish a causal relationship between smoking and lung cancer. Many Americans eat excessive amounts of sugar, and many Americans have chronic diseases related to obesity. There appears to be, at a minimum, a correlation between sugar consumption and obesity. But can science prove a causal link between the two? The scientific method doesn't provide an easy way to establish one. Taubes and Mukherjee help us understand why.
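
To appreciate why the signal is so elusive, it helps to run the numbers. Here is a minimal simulation, with entirely invented rates, of the common exposure–common disease problem that Mukherjee describes; nothing below comes from either article, and the exposure and risk figures are assumptions chosen only to illustrate the statistical point.

```python
# Minimal simulation of Mukherjee's 'white noise' problem: a modest risk
# attached to a nearly universal exposure barely separates the observed
# disease rates. All parameters are invented for illustration.
import random

random.seed(42)

N = 100_000            # people in a hypothetical cohort
EXPOSURE_RATE = 0.9    # exposure is nearly universal (e.g., sugar)
BASE_RISK = 0.10       # disease risk among the unexposed
RELATIVE_RISK = 1.1    # exposure raises risk by only 10 percent

cases = {True: 0, False: 0}
counts = {True: 0, False: 0}
for _ in range(N):
    exposed = random.random() < EXPOSURE_RATE
    risk = BASE_RISK * (RELATIVE_RISK if exposed else 1.0)
    counts[exposed] += 1
    cases[exposed] += random.random() < risk

print(f"rate among exposed:   {cases[True] / counts[True]:.3f}")
print(f"rate among unexposed: {cases[False] / counts[False]:.3f}")
# The rates differ by roughly one percentage point -- a gap that modest
# sample sizes, measurement error, and confounding can easily swallow.
```

The two groups' disease rates sit only about a percentage point apart, which is exactly the sort of association that, in Mukherjee's phrase, "disappears into the background, like white noise" unless a study is very large and very carefully controlled.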

Tuesday, April 26, 2011

The Magic Washing Machine

Hans Rosling—global health professor and chairman of Gapminder—makes statistical data about international development come alive. In a December 2010 TED Talk, Rosling argues that washing machines have proven vital in helping women in developing nations contribute to the progress of their societies. While this topic may not excite every viewer at first glance, Rosling brings enough creativity and energy to captivate millions.



Why is this presentation so strong? Rosling knows his topic well, he's enthusiastic about addressing his audience, and he begins with a compelling personal story that immediately grabs the viewer. Moreover, he uses a creative framing device—the washing machine—to bring to light some fundamental ideas about international development. And he uses not only slides but also props to enhance his message.

Another popular video that demonstrates Rosling's creativity is from The Joy of Stats, an hour-long documentary that he created with the BBC in 2010 and that is available in full online. In this five-minute clip from the documentary, Rosling dazzles his viewers once again with both his knowledge and his creative presentation techniques.



Rosling's ability both to understand social phenomena and to communicate so effectively reminds me of one of my favorite passages from Charles Baudelaire's The Painter of Modern Life. Baudelaire writes, "Few men have the gift of seeing. Fewer still have the power of expression." We are lucky that Rosling is sharing his gifts with the world.