Posted on 03.07.12
Feature written for Wired.co.uk for their Turing Week.
I spoke to Noah Goodman and Michael Frank – cognitive scientists at Stanford University – about the first mathematical model of shared context between speakers and why it’s the key to making computers better at conversation.
“Our best AI systems right now tend to be employed by companies as phone-answering services,” says Michael Frank, head of Stanford University’s Language and Cognition Lab. “But try explaining to these things that you want to cancel a fraudulent charge on your credit card.” He laughs. “You can see there’s a long way to go. They’re weak at understanding what you said, but they’re weakest at going from what you said to what you actually meant.”
When communicating, context is king. Meaning is often conveyed as much by what isn’t said as by what is. For instance, the proposal “Would you like to go for a coffee?” can be turned down with the words “I have to return some library books”, even though, taken out of context, those words do not constitute a rejection. But computers can be literal to a fault, with such indirectness lost on them.
Frank and colleague Noah Goodman, also a cognitive scientist at Stanford, have developed a mathematical encoding of what they call “common knowledge” and “informativeness” in human conversation. “We have a vastly powerful predictive model of the world,” says Goodman. “When somebody goes to understand a statement that somebody else has made, they’re making the best guess about the meaning of that statement, incorporating all these factors like informativeness and context.”
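One way to sketch the kind of model they describe is as a chain of nested reasoners: a literal listener who takes words at face value, a speaker who chooses the most informative word for that listener, and a pragmatic listener who inverts the speaker model with Bayes’ rule. The following minimal Python sketch is purely illustrative – the three-object scene and the word list are my own toy assumptions, not taken from Frank and Goodman’s experiments:

```python
# Toy sketch of a pragmatic-inference model in the spirit of Frank and
# Goodman's work. The scenario (three shapes, four words) is an invented
# example, not their actual stimuli.

OBJECTS = ["blue_square", "blue_circle", "green_square"]
WORDS = ["blue", "green", "square", "circle"]

def is_true(word, obj):
    """Literal semantics: does the word apply to the object?"""
    return word in obj

def literal_listener(word):
    """Spread belief uniformly over the objects the word literally fits."""
    consistent = [o for o in OBJECTS if is_true(word, o)]
    return {o: (1 / len(consistent) if o in consistent else 0.0)
            for o in OBJECTS}

def speaker(obj):
    """Choose words in proportion to how well a literal listener would
    recover the intended object -- 'informativeness'."""
    scores = {w: literal_listener(w)[obj] for w in WORDS}
    total = sum(scores.values())
    return {w: s / total for w, s in scores.items()}

def pragmatic_listener(word):
    """Bayes' rule: combine a uniform prior (the shared context) with the
    speaker model to infer which object the speaker meant."""
    scores = {o: speaker(o)[word] for o in OBJECTS}
    total = sum(scores.values())
    return {o: s / total for o, s in scores.items()}
```

In this toy scene, hearing “blue” leads the pragmatic listener to favour the blue square over the blue circle: a speaker who meant the circle could have said the more informative “circle”, so “blue” is the better bet for the square. That asymmetry is exactly the kind of inference a purely literal system misses.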
Read the whole interview at Wired.co.uk.
Posted on 03.07.12
Written for Summer 2012 issue of I, Science magazine.
Do the words we use determine the world we perceive?
If I say to-mah-toe and you say to-may-toe, to call the whole thing off would be rash. But if I say tomato and you say la tomate, we might be living in subtly different worlds. Do your tomatoes have something in common with cars (la voiture) and houses (la maison) and the Moon (la Lune) that mine do not, simply in virtue of the gender of your words? Lera Boroditsky, Assistant Professor in the Department of Psychology at Stanford University, thinks so. In a series of studies exploring linguistic relativism, she claims to have shown that “even small flukes of grammar, like the seemingly arbitrary assignment of gender to a noun, can have an effect on people’s ideas of concrete objects in the world”. 
In one experiment, for example, Boroditsky and colleagues asked German and Spanish speakers to describe a key – an object named by a masculine noun in German and a feminine noun in Spanish. Boroditsky found that German speakers were more likely to say “hard”, “heavy”, “metal”, and “useful”, while Spanish speakers favoured terms like “golden”, “intricate”, “little”, and “lovely”, which would appear to suggest that German speakers see keys as having more masculine qualities than Spanish speakers do. It is a fascinating claim, but a controversial one. “Unfortunately,” says Professor Gabriella Vigliocco, of University College London’s Division of Psychology and Language Sciences, “lack of replication is a major issue in this area.”
“Language is a fundamental feature of how we think about the world,” says Vigliocco. “Language tells us a lot about what concepts there are, and how we should conceive of the world in general.” But her own findings in experiments with English and Italian speakers do not support the idea that linguistic gender has a conceptual effect – though she admits it would be “far more interesting” if they did. “The fact that in Italian ‘the fork’ is feminine doesn’t make the fork more female-like, so to speak, than in a language that doesn’t have a gender,” she explains.
According to Vigliocco, linguistic gender comes into play in such experiments only once speakers are prompted to come up with characteristics for an object. “People use whatever resources they have [available] in order to solve such tasks, and if the language has gender, why not?” she says. “If your language were to divide objects into male or female, then that’s an obvious way to go about classifying them.”
The issue is complicated. “Of course language is playing a fundamental role in shaping our cognition,” she says. “On the other hand, it’s also not as black-and-white or as simple as saying ‘oh, this language has two genders, male and female, so for these speakers all the things that have a masculine gender are going to be more male-like than the things that have the female gender.’”
Of particular interest are the more abstract concepts “related to society, politics, religion, and so forth” that we can only access through language – things we cannot experience with our senses. “We learn to categorise the internal world via language,” she says. Our grasp of such abstract concepts may be more susceptible to linguistic influence than our grasp of concrete ones. Boroditsky, for example, has also looked at whether our perception of time is affected by the way we talk about it. Though there are again difficulties with replication, Vigliocco is more open to that possibility. “In my work, I always claimed that effects of language on cognition should be assessed one by one,” she says. “But this does not exclude the possibility of an effect for time.”
Brain-imaging techniques are now being used to complement behavioural experiments, which might open up new lines of research. “It’s a two-way street,” she says. “Especially if you are looking at how other systems – like perceptual systems – work and are affected by a specific language.” By monitoring what goes on in the brain while people speak, we can learn how words are processed mentally. New techniques might also help with the replication issues of behavioural work. “They are different experimental techniques that really go hand in hand,” she says.
But what of our initial question? Does the way we speak really affect our perception of the world? “I think there is good enough evidence now that language can affect cognition under some conditions,” she says. “So, really, we are beyond asking whether there is some form of relativism or not. Yes, there is. However, this does not also imply that language is the only force shaping our cognitive make-up. Our culture, our physical environment, and our bodies also play a critical role in how our cognition is shaped.”
Further reading: “How Does Our Language Shape the Way We Think?”, in Max Brockman (ed.), What’s Next? Dispatches on the Future of Science.
Posted on 20.10.11
Dennis Ritchie (standing) and Ken Thompson
The world has lost two giants of technology in as many weeks.
The news that Dennis Ritchie died on 12 October, aged 70, after enduring cancer and heart disease for several years, elicited a comparatively quiet response. Ritchie was the creator of the C programming language and a co-inventor of the Unix operating system – which means we are living in a world he helped to invent.
“When Steve Jobs died last week, there was a huge outcry, and that was very moving and justified”, said Rob Pike, a colleague of Ritchie’s, speaking to Wired. “But Dennis had a bigger effect, and the public doesn’t even know who he is.”
Pretty much all of the daily interactions we have with technology owe something to Ritchie’s creations 40 years ago. The internet is built on Unix, from the server farms behind Google and Amazon to the router through which you’re locally connected. Your TV probably runs an operating system based on Unix. As does your Mac, your iPhone, your iPad – OS X and iOS are built on BSD, a Unix variant – and of course any Linux machine and Android device.
Then there’s C, the language in which Unix – and a vast amount of other software from the core of Windows to MATLAB – is written. And when software isn’t written in C there’s a very good chance it’s written in a language descended from, or heavily influenced by, Ritchie’s creation, whether C++, Java, or C#.
Unix was developed by Ritchie and Ken Thompson at AT&T’s Bell Labs in the late 60s, after the project they had been working on – an ambitious multi-user operating system known as Multics – was dropped by the company for being too complex. Young, idealistic, and stubborn, Ritchie and Thompson decided to build a simpler, streamlined operating system by themselves. The result was Unics – later spelled Unix – the name a pun on Multics.
C was initially designed by Ritchie as a means to an end in developing his new operating system, but its versatility and portability across different computer architectures quickly made it an enormously useful tool in its own right. The C Programming Language, the book Ritchie wrote with Brian Kernighan setting out the de facto standard definition of the language, has become a classic.
Due to its status as a government-sanctioned telecoms monopoly, AT&T was at first barred from entering the computer industry, and thus unable to market the new operating system it suddenly had on its hands. So Ritchie and Thompson simply gave their creations away to friends and colleagues at universities, who used them to teach a generation of programmers, engineers, and computer scientists.
Even more crucially, Ritchie’s initial free dissemination of Unix and C helped give rise to the free software movement. When AT&T eventually wriggled itself into a position where it could make money from Unix, MIT researcher Richard Stallman began building a free replacement for it under the GNU (GNU’s Not Unix) umbrella. GNU – combined with the Linux kernel and myriad satellite utilities – is at the core of everything open source.
It’s easy to forget about the countless, unsleeping machines behind the slickness of today’s interactions with technology, but Ritchie’s work lives on at the heart of most of them. Now’s a good time to spare a thought for both them and him.