
Saturday, April 30, 2011

On Language: Wiping words from your dictionary

Does language shape the way we understand the world? This question, long debated by psycholinguists and TOK students worldwide, is the subject of a column by Erin McKean entitled "The power of not knowing: What’s missing from your dictionary?" which appears in this weekend's Boston Globe.

McKean offers a history of people and groups—everyone from Chairman Mao to Pope Benedict XVI to the Philadelphia Flyers—who have proclaimed certain words banished from their vocabularies. McKean observes that such claims of intentional ignorance are quite common (especially, she notes, among men) and reflect the idea that the words we know and use shape how we conceptualize our world and, perhaps, act in it. Known as the Sapir–Whorf hypothesis, this idea acknowledges the power of words and explains why, for example, a politician may proclaim (to a gathering of donors, of course) that he has wiped the word "defeat" from his vocabulary. McKean writes:
The idea of the 'word' surely has a certain power. Other kinds of information and other reference books don’t get quite the same treatment: You can speak of a place or thing as 'not [even] being on the map,' but there doesn’t seem to be a metaphorical equivalent for encyclopedias or almanacs, and in the age of cellphones, the idea of a metaphorically unlisted (or ex-directory) number is almost quaint. Most of the other 'lack' metaphors—being a few bricks short of a load or sandwiches short of a picnic, or not playing with a full deck—seem to indicate mental weakness, rather than strength.

And strength is what the 'not in the dictionary' metaphor is all about: These words that stand for difficult things (defeat, surrender, and failure, as well as other negative words such as can’t and no) may affect other people, but the stalwart individual can overcome them just by pretending the words don’t exist—that they are just strings of meaningless characters. Critical uses of this trope are much rarer: We’re far less likely to say that 'win isn’t in his dictionary' or 'he doesn’t know the meaning of the word victory.'
Most people believe that knowledge is power. McKean asks us to reconsider this maxim. She notes, "Sometimes deliberate not-knowing, perhaps, is the way to get things done."

Thursday, April 28, 2011

Sugar and Cellphones: On Scientific Challenges

Many people consider the knowledge claims generated by natural scientists to be, for the most part, quite sound. Leucippus may have speculated that atoms compose all matter, but only through the work of a slew of modern scientists—Lavoisier, Dalton, Thomson, and many others—can most of us say, with almost perfect certainty, that we know atoms compose all matter. Of course, the frontiers of science continue to challenge our assumptions about the natural world and our ability to use the scientific method to understand it—but by and large, people trust that method and consider the knowledge it has generated sound.

Against this backdrop of certainty, the New York Times Magazine published two feature articles earlier this month which, despite their apparently disparate topics, offered fascinating commentary on the complexities—practical, statistical, and political—of the scientific method. The first, Do Cellphones Cause Brain Cancer?, discusses, not surprisingly, the role that cell phones may play in causing brain cancer. On its face, the article demands attention, as the ideas the author presents should be of interest to any cell phone user worldwide. But I found the article's treatment of the scientific method just as intriguing.

The author of the article, Dr. Siddhartha Mukherjee, details three ways that cancer epidemiologists can determine the cause of a particular cancer. "The first," he writes, "is arguably the simplest. When a rare form of cancer is associated with a rare exposure, the link between the risk and the cancer stands out starkly." As an example, he notes that doctors in London discovered, in the late eighteenth century, a remarkably high incidence of scrotal cancer among chimney sweeps. The second way, he notes, is more challenging: "The opposite phenomenon occurs when a common exposure is associated with a common form of cancer: the association, rather than popping out, disappears into the background, like white noise." This common exposure–common cancer problem baffled twentieth-century researchers, who could not prove a statistical link between smoking and lung cancer.

The third way that epidemiologists can determine the cause of cancer presents more difficulty still. "A common exposure... associated with a rare form of cancer," Mukherjee writes, "is cancer epidemiology’s toughest conundrum." He goes on:
The rarity of the cancer provokes a desperate and often corrosive search for a cause... And when patients with brain tumors happen to share a common exposure—in this case, cellphones—the line between cause and coincidence begins to blur. The association does not stand out nor does it disappear into statistical white noise. Instead, it remains suspended, like some sort of peculiar optical illusion that is blurry to some and all too clear to others.
Mukherjee devotes the remainder of the article to discussing the challenges scientists face in determining what is (and isn't) a carcinogen. He doesn't provide the reader with a simple answer to the question he asks in the title, but he does explain why the scientific method, long considered tried-and-true, may not provide a pathway for answering it anytime soon. Appropriately, he ends the piece thus: "Understanding the rigor, labor, evidence and time required to identify a real carcinogen is the first step to understanding what does and does not cause cancer."

The second article, Is Sugar Toxic?, discusses the growing body of nutrition science that suggests that sugar—the sugar we find in small white packets, the high-fructose corn syrup we find in many packaged foods, and the sugar we find in fruit—is toxic to our bodies. Written by Gary Taubes, the author of Why We Get Fat: And What to Do About It, the article focuses on the work of Robert Lustig, a pediatrician made famous by a lecture he delivered in 2009, the video of which has been viewed more than one million times. I encountered a link to the video on a friend's Facebook wall about a week before I read the Times' article. 

Lustig has dedicated his career to demonstrating that sugar is toxic, that it is the leading cause of obesity and cancer. He makes a convincing argument, and while some scientists have bought it, others have not. Not surprisingly, the Sugar Association and the Corn Refiners Association have weighed in, declaring sugar safe and using evidence from the Food and Drug Administration to bolster their claim.

Although the question of sugar's toxicity is important, Taubes' discussion of the challenge that science has faced in attempting to determine the safety of sugar is equally important. Taubes details a powerful argument from Walter Glinsmann, a former F.D.A. administrator and current adviser to the Corn Refiners Association. Glinsmann contends that "sugar and high-fructose corn syrup might be toxic, as Lustig argues, but so might any substance if it’s consumed in ways or in quantities that are unnatural for humans." Taubes rightfully asks, "at what dose does a substance go from being harmless to harmful? How much do we have to consume before this happens?"

Lustig counters this argument by noting that sugar is not an "acute toxin," one similar to others the F.D.A. regulates, those with effects that "can be studied over the course of days or months." Sugar and high-fructose corn syrup, he claims, are "chronic toxins." Lustig explains that they are "not toxic after one meal, but after 1,000 meals." This, according to Taubes, is the great challenge facing Lustig and others who wish to prove that sugar is the cause of many common ailments.

In a way, the challenge faced by Lustig is similar to the challenge faced by cancer epidemiologists when they sought to establish a causal relationship between smoking and lung cancer. Many Americans eat excessive amounts of sugar, and many Americans have chronic diseases related to obesity. There appears to be, at the minimum, a correlation between sugar consumption and obesity. But can science prove a causal link between the two? The scientific method doesn't provide an easy way to establish one. Taubes and Mukherjee help us understand why.

Tuesday, April 26, 2011

The Magic Washing Machine

Hans Rosling—global health professor and chairman of Gapminder—makes statistical data about international development come alive. In a December 2010 TED Talk, Rosling argues that washing machines have proven vital in helping women in developing nations contribute to the progress of their societies. While this topic may not excite every viewer at first glance, Rosling brings enough creativity and energy to excite millions.



Why is this presentation so strong? Rosling knows his topic well, he's enthusiastic about addressing his audience, and he begins with a compelling personal story that immediately grabs the viewer. Moreover, he uses a creative framing device—the washing machine—to bring to light some fundamental ideas about international development. And he uses not only slides but also props to enhance his message.

Another popular video that demonstrates Rosling's creativity is from The Joy of Stats, an hour-long documentary that he created with the BBC in 2010 and that is available in full online. In this five-minute clip from the documentary, Rosling dazzles his viewers once again with both his knowledge and his creative presentation techniques.



Rosling's ability both to understand social phenomena and to communicate so effectively reminds me of one of my favorite passages from Charles Baudelaire's The Painter of Modern Life. Baudelaire writes, "Few men have the gift of seeing. Fewer still have the power of expression." We are lucky that Rosling is sharing his gifts with the world.

Sunday, April 24, 2011

Art in Nature?

This winter, when first embarking on the study of art as an Area of Knowledge, I gave my students a long list of things (e.g. a painting by Manet, a Dickens novel, an episode of Glee, a teapot from Ikea) and asked them to classify each as art or non-art. Great discussions ensued as the students wrestled with the question: What is art?

As we touched upon each item on the list, our discussion took many paths. Towards the end of the list, I had placed the following things sequentially: (1) the Grand Canyon, (2) a picture a friend snapped of the Grand Canyon and posted on Facebook, and (3) a Thomas Moran painting of the Grand Canyon. The students immediately raised some excellent questions, not only about the difference between "high" and "low" art (Can one create art with a $120 digital camera?!) but also about art and the natural world. Can objects from the natural world, asked my students, be art? Certainly, we agreed, they can be beautiful, but are they art? Grappling with this question helped expose the complexities of defining and classifying art.

I wish at that time I had known about the Wellcome Image Awards, which annually honor the "most informative, striking and technically excellent images" added to the Wellcome Images collection. According to its website, Wellcome Images "is the world's leading source of images of medicine and its history, from ancient civilisation and social history to contemporary healthcare, biomedical science and clinical medicine." I just came across a BBC slideshow (via the International School of Manila's TOK blog) of several of this year's winning photographs. Here are two stunning images from the slideshow:
Curled up ruby-tailed wasp
Zebrafish retina


Catherine Draycott, the Head of Wellcome Images, narrates the BBC slideshow. While the beautiful images fade in and out, she says, in part, "They look beautiful, but they're not art. There is artifice in them. Some of them have been colored, using the judgement of the scientists, whether aesthetic or scientific. So they are striking aesthetically, but they weren't created to be so. And that's one of the most interesting things about the award for me." I find Draycott's narration fascinating because she, too, seems to be struggling with the question: What is art? She seems to believe that art is, by definition, the product of human intent, that art does not spring from the natural world. At the same time, she seems so moved by the beauty of the images that they take on the soul-touching power of art. I have no doubt that these images would have helped inform my class's discussion. I'll add them to the list.

The Science of Why We Don't Believe Science

How do we react when we encounter ideas that contradict our deepest beliefs? This question is the topic of an article in Mother Jones entitled "The Science of Why We Don't Believe Science." Science journalist Chris Mooney, a veteran of the contemporary climate change wars, presents an overview of recent findings from the natural and social sciences. Even though the article (overtly) reflects Mooney's political leanings, it also offers an exploration of the nature of knowing.

Mooney reviews recent brain-scan findings by neuroscientists suggesting that our emotions strongly influence our reasoning abilities, complicating the idea of pure rationality. These findings build upon the earlier work of psychologists, who have long recognized that cognitive biases—such as "confirmation bias," the tendency to favor information that confirms our existing beliefs over information that refutes them—impact our reasoning.

Mooney cites new studies, however, which demonstrate that combating confirmation bias may prove more difficult than previously thought. According to a study by Yale Law School professor Dan Kahan, presenting people with evidence contrary to their beliefs rarely leads them to change those beliefs. "In fact," Mooney writes, "head-on attempts to persuade can sometimes trigger a backfire effect, where people not only fail to change their minds when confronted with the facts—they may hold their wrong views more tenaciously than ever."

Mooney observes that humans have long displayed the inclination to engage with information that confirms their beliefs. He worries, however, that as technology allows us to more easily select the sources of information we consume, to cast our net into a sea of ideas that we predefine, we are giving up opportunities to encounter information contrary to our beliefs. This, he contends, presents a danger to the future of public discourse.

I don't buy into Mooney's premise about technology, at least not wholeheartedly. On the one hand, the Internet does allow individuals to create "walled gardens" in which they limit the types of people and ideas they encounter. I select my friends on Facebook and the feeds I subscribe to in Google Reader. On the other hand, the Internet allows people from across cultures, from around the world, to connect and learn from each other as never before. Wikipedia entries about controversial topics (e.g. global warming) serve as battlegrounds for people of various ideological stripes and hint at the potential for the Internet to foster sustained dialogue across cultural and political lines. And suggestion engines—like Google Fast Flip and StumbleUpon—may increase the likelihood that people encounter ideas with which they disagree.

In the coming decades, I hope we see the emergence of more battlegrounds and fewer walled gardens online. But regardless of how the Internet develops, teachers have a special role to play in encouraging students to seek out a range of information online. As always, teachers have an obligation to expose students to a wide range of viewpoints while they sit in the classroom. But teachers also have an obligation to help students develop the desire and ability to explore ideas—even those they may find antithetical to their beliefs—throughout their lives. This desire is the foundation of lifelong learning, yet it may not, as science has demonstrated, come naturally. Nonetheless, it's as important as ever to instill in our students.

Saturday, April 23, 2011

The United States of Autocomplete

At the end of last year, the Boston Globe Brainiac blog pointed to a map of the United States created by Dorothy Gambrell of the Very Small Array blog. Gambrell created the map, entitled "The United States of Autocomplete," by replacing the name of each state with the first autocomplete suggestion from its associated Google search:

Original image from the Very Small Array blog
Some of the results are rather amusing. On the map, New Jersey is labeled "New Jersey Transit" and Missouri "Missouri Compromise." Vermont, where I grew up, is labeled "Vermont Country Store," a fitting alteration for a state dotted with mom-and-pop establishments.

Today, the map might look quite different than it did when Gambrell created it in December 2010, and tomorrow it is likely to appear different still. The hidden algorithm that drives Google's autocomplete function is clearly dynamic—the search term "Japan" today completes with "tsunami," "earthquake," and "nuclear," which I can't imagine it did two months ago. The algorithm is also likely personalized for individual users, at least when they are signed into their Google accounts. Of course, because Google keeps its search and autocomplete algorithms secret—a reasonable move to protect its intellectual property—the public may never know for certain why the autocomplete function works the way it does.
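Curious readers can check for themselves how the suggestions have drifted. Below is a minimal Python sketch that fetches current suggestions for a few state names. It assumes Google's unofficial suggestion endpoint (suggestqueries.google.com), which is undocumented and could change or disappear at any time; this is an illustration only, not the method Gambrell used to build her map.

    import json
    import urllib.parse
    import urllib.request

    # Unofficial endpoint (assumption): with client=firefox it returns JSON
    # shaped like [query, [suggestion1, suggestion2, ...]].
    SUGGEST_URL = "https://suggestqueries.google.com/complete/search?client=firefox&q="

    def autocomplete(query):
        """Fetch the current autocomplete suggestions for a query."""
        with urllib.request.urlopen(SUGGEST_URL + urllib.parse.quote(query)) as resp:
            payload = json.loads(resp.read().decode("utf-8", errors="replace"))
        return payload[1]

    if __name__ == "__main__":
        # Print the top suggestion for a few of the states mentioned above.
        for state in ("Vermont", "New Jersey", "Missouri"):
            suggestions = autocomplete(state)
            print(state, "->", suggestions[0] if suggestions else "(no suggestion)")

Results will vary by date, region, and account, which is precisely the point: the map is a snapshot of a moving target.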

As a teacher who sees his students use Google searches routinely, I wonder to what extent autocomplete suggestions—let alone search results—serve as de facto regulators of information for students who are not aware of their limitations. For example, here are the autocomplete suggestions for "obama" and "nazi":

Many high school students could, I imagine, easily make sense of the autocomplete suggestions, recognizing that "obama birth certificate" reflects a contemporary political controversy and that "nazi zombies" has nothing to do with World War II. The suggestions may, however, lead younger students astray, urging them, however slightly, to associate ideas incorrectly. Teaching students to be savvy web users—helping them develop a healthy skepticism about all of the information they come across, online and off—should be an important goal in classrooms everywhere.

Friday, April 22, 2011

Learning Online

This past fall, Independent Teacher Magazine published my article, "Learning Online: Supplementing Classroom Instruction with Virtual Review Sessions." Although I encourage you to read the article in full, here is the gist:
  1. For many students and parents, the value of an independent school education lies in the personal connections that students develop with intelligent and caring faculty and staff.
  2. At the same time, new technologies are transforming teaching and learning. As educators, we must recognize that the growing use of technology and the increasing availability of high-speed Internet connections present tremendous opportunities for us and our students.
  3. Over the last several years, a number of websites have emerged that enable entire classes to meet online, unrestrained by the confines of the school building and the limits of the school day.
  4. These websites, many of which I have used with students, have the potential to supplement our teaching as well as serve as key elements in school-wide emergency planning.
The article has more to offer, so check it out.

Thursday, April 21, 2011

Can video games be art?

In a recent Chicago Sun-Times column, Roger Ebert defends his position that video games can never be art—a bold statement from a respected professional critic.

Ebert wrote the column, in part, in response to a TEDxUSC talk by game designer Kellee Santiago. Santiago argues that video games can indeed be art, and she provides examples that illustrate the artistic components of several games. After reviewing the criteria that people have historically used to separate art from non-art, Ebert acknowledges that art is difficult to define, yet he does come to one conclusion: Games can never be art. He writes:
One obvious difference between art and games is that you can win a game. It has rules, points, objectives, and an outcome. Santiago might cite a immersive game without points or rules, but I would say then it ceases to be a game and becomes a representation of a story, a novel, a play, dance, a film. Those are things you cannot win; you can only experience them.
I found his analysis of the artistic value of games particularly interesting because one of my Theory of Knowledge classes this year spent quite a bit of time debating this question. Many students, particularly athletes, considered soccer as much an art as ballet. Like these students, most of the people who commented on Ebert's piece disagreed with his bright-line distinction between games and art. I also disagree. I didn't grow up playing video games—and I don't play any now—but I find it difficult to dismiss the artistic merits of video games categorically.

The psychological power of superstition

A post on the New York Times Phys Ed blog reminds us that, when activated, good-luck superstitions can improve performance:
“Activating a good-luck superstition,” the authors [of the study discussed] wrote, “leads to improved performance by boosting people’s belief in their ability to master a task.” More precisely, they added, “the present findings suggest that it may have been the well-balanced combination of existing talent, hard training and good luck-underwear that made Michael Jordan perform as well as he did.”
What are your good-luck superstitions?

Wednesday, April 20, 2011

Fair Witnesses

A wonderful quote about the nature of knowing from Stranger in a Strange Land:
"You know how Fair Witnesses behave."

"Well... no, I don’t. I’ve never had any dealings with Fair Witnesses."

"So? Perhaps you weren’t aware of it. Anne!"

Anne was seated on the springboard; she turned her head. Jubal called out, "That new house on the far hilltop–can you see what color they’ve painted it?"

Anne looked in the direction in which Jubal was pointing and answered, "It’s white on this side." She did not inquire why Jubal had asked nor make any comment.

Jubal went on to Jill in normal tones. "You see? Anne is so thoroughly indoctrinated that it doesn’t even occur to her to infer that the other side is probably white too. All the King’s horses and all the King’s men couldn’t force her to commit herself as to the far side... unless she herself went around to the other side and looked–and even then she wouldn’t assume that it stayed whatever color it might be after she left... because they might repaint it as soon as she turned her back."
 Oh, to be a Fair Witness!

They get to me, too

Although I don't consider myself a militant grammarian, as a writer and as a teacher of writing, I have a number of grammar pet peeves. High on my list of annoyances sit pronoun reference errors. When a writer uses a pronoun without a logical antecedent—or when the antecedent is vague—the grammarian in me awakens, and I can barely stop myself from picking up a pen and scribbling "vague/unclear pronoun antecedent!"

Recently, I delighted in reading "They Get to Me: A young psycholinguist confesses her strong attraction to pronouns" in The American Scholar. The author, Jessica Love, discusses a number of fascinating aspects of pronoun use and history. She notes that pronouns, by definition:
only contain vague information, like first-person or plural. In order for something this vague to effectively retrieve a word’s meaning, there has to be a whole lot of context. Imagine all the words contained in your mind as a vast pool of fish. Look carefully and you’ll see that each fish is different from all the others. If you had a hook selective enough, you’d be able to control which fish you catch. But pronouns are not selective hooks. Pronouns are sweeping nets. You have to cast your net shallowly in the hopes that you catch the one noun the pronoun refers to. That’s what context does: it pushes what’s relevant to the surface of the mind.
Helping students understand the contextual nature of pronouns can be challenging. Young writers often struggle to see how anyone (especially their teacher!) could not "see" the context that is so clear in their own minds. As students develop both the facility to use more complex language structures and the recognition that their readers' minds may lack contextual information, they become more skilled at detecting and correcting pronoun reference errors.

Love also discusses dummy pronouns, "which don’t mean anything at all." She explains:
They’re those pronouns that exist only because the English language demands that each sentence contain a subject: the it in “It’s raining” or the there in “There is a shed in my back yard.” (Note: the there only works as an example of a dummy pronoun if I am not pointing to a shed, and am nowhere near my back yard.) (Note: most linguistic examples have caveats like this, making the linguist’s life frustrating...)
After reading Love's piece, pronouns may get to you, too.

Tuesday, April 19, 2011

Shirky and Pink on Cognitive Surplus

Last May, Wired published a conversation between Clay Shirky and Daniel Pink, in which they discuss the idea of cognitive surplus, the topic of Shirky's new book. I find the idea of cognitive surplus, which Shirky defines as the cognitive excess we can spend as we please, fascinating. Shirky argues that Web 2.0 technologies have enabled us to use our cognitive surpluses to connect with each other and add value as never before:
Television was a solitary activity that crowded out other forms of social connection. But the very nature of these new technologies fosters social connection—creating, contributing, sharing. When someone buys a TV, the number of consumers goes up by one, but the number of producers stays the same. When someone buys a computer or mobile phone, the number of consumers and producers both increase by one. This lets ordinary citizens, who’ve previously been locked out, pool their free time for activities they like and care about. So instead of that free time seeping away in front of the television set, the cognitive surplus is going to be poured into everything from goofy enterprises like lolcats, where people stick captions on cat photos, to serious political activities like Ushahidi.com, where people report human rights abuses.
Shirky and Pink point to Wikipedia as a great example of what the pool of cognitive surplus can accomplish when properly connected with a framework that allows for collaboration across continents. Truly exciting times ahead!

Consumer Ethics

Is it ethical to download music illegally? To read books and magazines in a bookstore without making a purchase? To buy an article of clothing, wear it to a function, and then return it? To visit a bricks-and-mortar store in order to learn about a product, then leave the store and buy the product cheaper online?

Jason Fertig, an assistant professor of management at the University of Southern Indiana in Evansville, Indiana, poses these questions, and others, in order to help future business leaders think critically about their ethical obligations as consumers. In an article published by the National Association of Scholars, he writes:
I urge students to convince me whether these consumer behaviors signal potential unethical behavior at later times. Similar to the downloading issue, who would make the better executive, a person who respects the merchandise at a bookstore (or at least buys a coffee when spending considerable time at the store) or one that tries to justify reading GQ cover-to-cover in the store, when there is a copy at the local library that is available for free use?
Fertig notes that his aim is to help students understand that self-knowledge and self-control are fundamental elements in ethical decision making. "It is about knowing how to battle your own flaws," he writes. "Knowing when to blow that whistle on yourself when no one else will."

Monday, April 18, 2011

Derek Sivers: Weird, or just different?

In this, one of my favorite TED Talks, Derek Sivers asks his audience to rethink basic assumptions about how we make sense of the world. At under three minutes long, it's worth watching:



Four philosophical questions to make your brain hurt

I just came across an article from a 2008 edition of the BBC News Magazine which raises four epistemological, metaphysical, and ethical questions for the general (and impatient) reader. The four questions are:
  • Should we kill healthy people for their organs?
  • Are you the same person who started reading this article?
  • Is that really a computer screen in front of you?
  • Did you really choose to read this article?
Fascinating questions to ponder and discuss!