Science & technology | Artificial intelligence and psychology

The computer will see you now

A virtual shrink may sometimes be better than the real thing

ELLIE is a psychologist, and a damned good one at that. Smile in a certain way, and she knows precisely what your smile means. Develop a nervous tic or tension in an eye, and she instantly picks up on it. She listens to what you say, processes every word, works out the meaning of your pitch, your tone, your posture, everything. She is at the top of her game but, according to a new study, her greatest asset is that she is not human.

When faced with tough or potentially embarrassing questions, people often do not tell doctors what they need to hear. Yet the researchers behind Ellie, led by Jonathan Gratch at the Institute for Creative Technologies, in Los Angeles, suspected from their years of monitoring human interactions with computers that people might be more willing to talk if presented with an avatar. To test this idea, they put 239 people in front of Ellie to have a chat with her about their lives. Half were told (truthfully) they would be interacting with an artificially intelligent virtual human; the others were told (falsely) that Ellie was a bit like a puppet, and was having her strings pulled remotely by a person.

Designed to search for psychological problems, Ellie worked with each participant in the study in the same manner. She started every interview with rapport-building questions, such as, “Where are you from?” She followed these with more clinical ones, like, “How easy is it for you to get a good night’s sleep?” She finished with questions intended to boost the participant’s mood, for instance, “What are you most proud of?” Throughout the experience she asked relevant follow-up questions—“Can you tell me more about that?” for example—while providing the appropriate nods and facial expressions.

Lie on the couch, please

During their time with Ellie, all participants had their faces scanned for signs of sadness, and were given a score ranging from zero (indicating none) to one (indicating a great degree of sadness). Also, three real, human psychologists, who were ignorant of the purpose of the study, analysed transcripts of the sessions, to rate how willingly the participants disclosed personal information.

These observers were asked to look at responses to sensitive and intimate questions and prompts, such as, “How close are you to your family?” and, “Tell me about the last time you felt really happy.” They rated the responses to these on a seven-point scale ranging from -3 (indicating a complete unwillingness to disclose information) to +3 (indicating a complete willingness). All participants were also asked to fill out questionnaires intended to probe how they felt about the interview.

Dr Gratch and his colleagues report in Computers in Human Behavior that, though everyone interacted with the same avatar, their experiences differed markedly based on what they believed they were dealing with. Those who thought Ellie was under the control of a human operator reported greater fear of disclosing personal information, and said they managed more carefully what they expressed during the session, than did those who believed they were simply interacting with a computer.

Crucially, the psychologists observing the subjects found that those who thought they were dealing with a human were indeed less forthcoming, averaging 0.56 on the disclosure scale compared with the other group’s 1.11. The first group also betrayed less sadness, with an average facial-sadness score of 0.08 against the other group’s 0.12.

This quality of encouraging openness and honesty, Dr Gratch believes, will be of particular value in assessing the psychological problems of soldiers—a view shared by America’s Defense Advanced Research Projects Agency, which is helping to pay for the project.

Soldiers place a premium on being tough, and many avoid seeing psychologists at all costs. That means conditions such as post-traumatic stress disorder (PTSD), to which military men and women are particularly prone, often get dangerous before they are caught. Ellie could change things for the better by confidentially informing soldiers with PTSD that she feels they could be a risk to themselves and others, and advising them about how to seek treatment.

If, that is, a cynical trooper can be persuaded that Ellie really isn’t a human psychologist in disguise. Because if Ellie can pass for human, presumably a human can pass for Ellie.

This article appeared in the Science & technology section of the print edition of August 16th 2014, under the headline "The computer will see you now"
