Lie-detecting software created using real-world court data

A unique lie-detecting program is being developed using real-world court data – material from past trials in which some of the speakers are truthful and others are not. Researchers from the University of Michigan say it is so far up to 75% accurate in identifying who is trying to deceive, compared with human accuracy of slightly more than 50%.

Project leader Rada Mihalcea, professor of computer science and engineering, and colleagues explained that, unlike a polygraph or lie detector test, their system does not need to touch the subject in order to spot liars.

Their prototype considers the speaker’s gestures as well as his or her words. So far, the scientists say, it has revealed several signs linked to lying: people who are lying tend to move their hands more, try to sound more certain, and look their questioners in the eye a bit more than those presumed to be telling the truth.

Figure: Sample screenshots showing facial displays and hand gestures from the real-life trial clips used in this study. 1. Deceptive trial with forward head movement. 2. Deceptive trial with both-hands movement. 3. Deceptive trial with one-hand movement. 4. Truthful trial with raised eyebrows. 5. Deceptive trial with a scowl. 6. Truthful trial with an upward gaze. (Image: International Conference presentation)

Software could be used in many fields

Team member Mihai Burzo, assistant professor of mechanical engineering at UM-Flint, believes the system could one day be a helpful tool for juries, security agents, and perhaps even mental health professionals.

The researchers developed the software by using machine-learning techniques to train it on a set of 120 video clips from media coverage of real trials.

Some of their video footage came from the website of the Innocence Project, a non-profit legal organization committed to exonerating wrongly convicted people.

Prof. Mihalcea says the ‘real world’ aspect of their work is one of the main ways it is different.

Only real world situations used

Prof. Mihalcea explained:

“In laboratory experiments, it’s difficult to create a setting that motivates people to truly lie. The stakes are not high enough.”

“We can offer a reward if people can lie well – pay them to convince another person that something false is true. But in the real world there is true motivation to deceive.”

Figure: As shown on this graph, head shaking and an open mouth are more closely linked to people who are telling the truth, while liars tend to scowl and nod their heads repeatedly. (Image: International Conference presentation)

The videos include testimony from both witnesses and defendants. In half of the clips, the speaker is deemed to be lying. To determine who was telling the truth, the team compared the testimony with the trial verdicts.

Software analyzes people’s words and gestures

To carry out the study, the researchers transcribed what people said in the videos, including vocal fillers such as ‘uh’, ‘ah’ and ‘um’. They then gathered and analyzed data on the subjects’ use of various words and categories of words.
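As a rough illustration of this kind of feature extraction – the transcript snippet, filler list, and word categories below are invented stand-ins, not taken from the study – counting fillers and category words in a transcript might look like:

```python
from collections import Counter
import re

# Hypothetical transcript snippet (not from the study)
transcript = "um I was, uh, at home that night and she, um, she left early"

# Illustrative filler list and word categories (assumptions for this sketch)
FILLERS = {"uh", "um", "ah"}
CATEGORIES = {
    "self_reference": {"i", "me", "my", "we"},
    "other_reference": {"she", "he", "they"},
}

# Tokenize into lowercase words and count occurrences
tokens = re.findall(r"[a-z']+", transcript.lower())
counts = Counter(tokens)

# Build one feature vector per transcript: filler count plus per-category totals
features = {"filler_count": sum(counts[f] for f in FILLERS)}
for name, words in CATEGORIES.items():
    features[name] = sum(counts[w] for w in words)

print(features)
# prints {'filler_count': 3, 'self_reference': 1, 'other_reference': 2}
```

Repeated over every transcript, counts like these become the verbal half of each video’s feature vector.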

They also counted and listed the gestures used by the speakers in the videos, using a standard coding scheme for interpersonal interactions that scores nine different movements of the hands, mouth, brow, eyes and head.

They then fed the data into their system and let it sort the videos. When using input from both the speaker’s gestures and words, it was 75% accurate in determining who was lying – significantly better than the roughly coin-flip performance most humans achieve.
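The overall pipeline – one feature vector per video in, a truth/lie label out, with accuracy estimated on held-out examples – can be sketched as follows. The data and the simple nearest-neighbour rule here are illustrative stand-ins, not the classifier or features the researchers actually used:

```python
# Each row: (filler_count, both_hand_gestures, direct_gaze_count), label 1 = deceptive.
# Values are synthetic, chosen only to show the train/evaluate loop.
DATA = [
    ((5, 3, 8), 1), ((4, 2, 7), 1), ((6, 4, 9), 1), ((5, 3, 7), 1),
    ((1, 0, 4), 0), ((2, 1, 5), 0), ((1, 1, 4), 0), ((2, 0, 5), 0),
]

def dist(a, b):
    """Squared Euclidean distance between two feature vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b))

def predict(x, train):
    """Label of the nearest training example (1-nearest-neighbour rule)."""
    return min(train, key=lambda row: dist(x, row[0]))[1]

# Leave-one-out evaluation: hold out each video once, train on the rest
correct = 0
for i, (x, y) in enumerate(DATA):
    train = DATA[:i] + DATA[i + 1:]
    correct += predict(x, train) == y

print(f"accuracy: {correct / len(DATA):.0%}")
# prints accuracy: 100% on this cleanly separated toy data
```

On the study’s 120 real clips the signal is far noisier, which is why 75% counts as a strong result.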

Prof. Mihalcea said:

“People are poor lie detectors. This isn’t the kind of task we’re naturally good at. There are clues that humans give naturally when they are being deceptive, but we’re not paying close enough attention to pick them up.”

Image caption: Humans are not good at spotting liars.

“We’re not counting how many times a person says ‘I’ or looks up. We’re focusing on a higher level of communication.”

Common liars’ behaviours

The following common behaviours were found among people who are lying:

– Scowling or grimacing of the entire face: 30% of the liars in the videos did this, compared with 10% of the truthful subjects.

– Looking directly at the questioner: 70% of liars did so, compared with 60% of those telling the truth.

– Gesturing with both hands: 40% of liars did this, compared with just 25% of the truthful subjects.

– Vocal fillers such as ‘uh’, ‘um’ and ‘ah’: much more common among the liars.

– Distancing: those who were lying distanced themselves from the action with words like ‘she’ or ‘he’ instead of ‘we’ or ‘I’, and used phrases that reflected certainty.

Scientists integrating physiological and cultural factors

Prof. Burzo said:

“We are integrating physiological parameters such as heart rate, respiration rate and body temperature fluctuations, all gathered with non-invasive thermal imaging.”

They are also trying to factor in the role of cultural influence.

Prof. Burzo added:

“Deception detection is a very difficult problem. We are getting at it from several different angles.”

In this study, the scientists classified the gestures themselves, rather than letting the computer do it. They are now training the computer to do that.

Research fellows Mohamed Abouelenien and Veronica Perez-Rosas were also involved in this study. A paper on their findings – ‘Deception Detection using Real-life Trial Data’ – was presented at the International Conference on Multimodal Interaction. It is published in the 2015 conference proceedings.

The researchers wrote in the Conclusions section of their paper:

“To our knowledge this is the first work to automatically detect instances of deceit using both verbal and non-verbal features extracted from real trial recordings. Future work will address the use of automatic gesture identification and automatic speech transcription, with the goal of taking steps towards a real-time deception detection system.”

The study was funded by the Defense Advanced Research Projects Agency, the John Templeton Foundation, and the National Science Foundation.

Citation: “Deception Detection using Real-life Trial Data,” Verónica Pérez-Rosas, Mohamed Abouelenien, Rada Mihalcea and Mihai Burzo. International Conference on Multimodal Interaction, 2015. DOI: 10.1145/2818346.2820758.
