Do the “Eyes” Really Have It? Why Eyewitness Testimony Is More Fallible Than You Think


Eyewitness testimony is often the basis of our favorite crime-comedy films, perhaps never more hilariously parodied than in the Oscar-winning classic, My Cousin Vinny. While eyewitnesses may not be as laughable and easily dismissible in real life, the reliability of their testimony deserves similar scrutiny.

Since the advent of forensic DNA testing in 1985, almost 78% of the first 130 convictions in the United States have been overturned, according to the New York-based public policy organization The Innocence Project. Studies show that the majority of these exoneration cases were due to mistaken eyewitness accounts. As any CSI episode will tell you, DNA testing is pivotal at any crime scene. Biological material — like skin, hair or blood — is now considered the most reliable physical evidence: it identifies a given individual with almost 99% accuracy.

Still, forensic psychologists have long been wary of applying this level of scientific certainty to eyewitness accounts. We’ve trawled through the research papers and aggregated the reasons why — have a look.

 

Distance makes the eyes go fuzzier

As common sense tells us, the further away you are from an object, the less clear it appears. So what is the exact distance at which you can still clearly see a perpetrator’s face? Geoffrey Loftus, a professor of perception and cognitive psychology, developed a mathematical tool that shows exactly where in space our vision becomes blurry.

In one of his studies, he gave a group of people with 20/20 (or perfect) vision a series of small, unrecognizable images of famous people, including the likes of Michael Jordan, Julia Roberts and George W. Bush. He gradually enlarged the images until the participants were able to recognize the celebrity. Converting image size to actual distance, he analyzed the degree of blurring that still leaves a face recognizable to a healthy pair of eyes.

From this, he developed a formula that calculates how much detail a healthy onlooker loses at different distances in broad daylight. The further away an image is, the smaller it appears on the retina, and consequently the less decipherable it becomes. At 10 feet, you may not make out a person’s eyes; at 500 feet you see only a blurred face.
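Loftus’s published model is more detailed than we can reproduce here, but the underlying geometry is simple: the visual angle a facial feature subtends shrinks roughly in inverse proportion to distance, and once that angle approaches the one-arcminute resolution limit of normal 20/20 vision, the feature blurs out. Here is a minimal sketch in Python; the one-inch eye width and the chosen distances are illustrative assumptions, not Loftus’s actual figures:

```python
import math

def visual_angle_arcmin(feature_size_ft, distance_ft):
    """Visual angle subtended by a feature, in minutes of arc.

    A feature becomes hard to resolve as its angle approaches the
    ~1 arcminute acuity limit of normal (20/20) vision.
    """
    angle_rad = 2 * math.atan(feature_size_ft / (2 * distance_ft))
    return math.degrees(angle_rad) * 60

# An eye is roughly 0.08 ft (~1 inch) across -- an illustrative value.
for d in (10, 100, 500):
    print(f"{d:>4} ft: {visual_angle_arcmin(0.08, d):6.2f} arcmin")
```

Running this, an eye-sized feature subtends tens of arcminutes at 10 feet but falls below the one-arcminute threshold by 500 feet, consistent with the “blurred face” intuition above.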

 

Line ‘em up and get it wrong

The most popular way crime investigators record an eyewitness account is by lining up potential suspects. Typically, the eyewitness is given instructions during the lineup; however, witnesses can be influenced by unintended bias. One study, published in the journal Law and Human Behavior, showed participants a video of a crime involving four culprits. They were then shown four series of six people, with the instruction that each series may or may not include a perpetrator. The study found that this instruction alone increased the number of misidentifications.

 

Quacks aren’t quacks after all

If the person administering a lineup knows which member is the real suspect, they can inadvertently show bias during the procedure. One way to weed out this bias is to run a “double-blind” lineup, in which the administrator has no knowledge of who the true suspect is.

Another scientific trick is to time how long the witness takes to identify the culprit in a lineup. Research shows that the quicker the identification, the more likely it is to be accurate. It’s what psychologists call the “10 to 12 second” rule: studies show that if a witness takes less than 10 to 12 seconds to identify the suspect, they are accurate about 90% of the time; accuracy drops to about 50% if they take longer.

In the US, New Jersey, North Carolina and Wisconsin have adopted reforms to better validate eyewitness testimony. While eyewitness accounts will never be perfect, it’s encouraging to know that forensic psychology is paving the way to make them better.

 

Your Turn: Are there any studies that attest to the precision of eyewitness testimony? In other words, have you found proof that some of what we’re saying just isn’t true? Let us know in the comments. We’d love to hear from you.

Comments


2 Comments

  • Paul says:

I am writing a crime novel which involves the discovery of a skull, belonging to a person who died maybe 25-35 years ago, hidden in a collection of ancient skulls in a church crypt. This skull needs, eventually, to be recognised as relatively modern just by someone looking at it and having his/her suspicions raised. Would there be any visual clues that might highlight it as ‘different’?
    Thank you.

    • Hi Paul

We have an article on skull identification here (http://bit.ly/1z0zRVu). Skull dating is difficult to do without lab work, because the state of decomposition can differ depending on how the skeleton was prepared, buried or interred. As far as visual identification goes, modern Homo sapiens skulls, with their large braincase and high forehead, have been basically unchanged for over 100,000 years. Truly archaic skulls are on a completely different time scale: Neanderthals (Homo neanderthalensis) died out around 40,000 years ago, and Homo erectus is older still, so neither would turn up in an old church crypt; the two senses of ‘ancient’ simply don’t overlap. There are differences between the skulls of different populations and sexes, and of certain isolated tribes in remote locations, but, once again, I think you want all the skulls to come from the same population of western man.

You could have your ‘special’ skull possess very slight differences from the rest of the skulls in the church. For example, the others could all be bleached white, completely bare of tissue and remnants, while the one you want your hero to notice has a different shade: maybe a stain, some nose cartilage or a few hair particles implying it had a different post-mortem history than the others. Dating is usually done in labs, looking for DNA in marrow or using radiocarbon analysis; those differences would not be obvious, even to the most trained eye.
