Unconvincing Guilt: The State of Modern Forensic Science

By Chris Honeycutt

In February 2011, I wrote a short piece on the uncertainty of DNA evidence for the NYADP blog on the Times Union website*.  This article evolved from the subsequent discussion with a former board member of the International Association for Identification.

What we learned is that even “hard” evidence is not nearly as certain as the public or juries believe it to be.

The false-positive rates of DNA testing are largely unknown (Thompson et al., 2003), and some forms of evidence, such as ballistics, handwriting analysis, and hair identification, can have error rates approaching a terrifying 40-100% (Saks and Koehler, 2005).

Fingerprints, which have been trusted as hard evidence in criminal investigations for well over 100 years, also rest on questionable science.  In January 2002, Judge Louis H. Pollak of the U.S. District Court for the Eastern District of Pennsylvania found that fingerprinting failed to meet what are known as the Daubert standards, criteria set out by the Supreme Court in 1993 for the admissibility of scientific evidence in court (Cho, 2002).  Recent preliminary studies suggest fingerprint analysis can have an error rate of 5-10% (Langenburg et al., 2010), while its true error rate remains unknown and untested (US Dept. of Justice Reports, 2009).

Department of Justice Recognizes the Flaws in Forensic Evidence

When fingerprinting failed admissibility standards, it created tremendous problems for the justice community.  A series of congressional hearings was held, culminating in the passage of the 2006 Science, State, Justice, Commerce, and Related Agencies Appropriations Act, which, among other goals, provided funding for a committee created by the National Academy of Sciences.

In 2009 the committee released the report “Strengthening Forensic Science in the United States: A Path Forward,” cited above as US Dept. of Justice Reports, 2009 and hereafter as “SFS 2009.”  The 350-page report lists dozens of examples of spurious science used in forensics, and hundreds of examples of faulty lab technique and fraud.

Some of these involve commonly trusted lines of forensic evidence, such as fingerprints.  Contrary to public perception, much fingerprint identification is performed outside traditional crime laboratories, often by practitioners with no accreditation: 66 percent of fingerprint IDs are done outside such laboratories [pg. 64, SFS 2009], and while external forensic entities received an average of 2,780 cases a year each, only 15 percent of them are accredited [pg. 64, SFS 2009].  Experts within the field such as Mnookin have noted “a general lack of validity testing for fingerprinting; the relative dearth of difficult proficiency tests; the lack of a statistically valid model of fingerprinting; and the lack of validated standards for declaring a match” [pg. 106, SFS 2009, citing Mnookin].  Furthermore, forensic entities outside crime laboratories do not participate in accreditation systems and are not required to do so [pg. 200, SFS 2009].  Yet, despite this lack of testing and validation, fingerprint experts on the stand routinely testify that the error rate of fingerprinting is approximately zero [pgs. 103-104, SFS 2009].
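To see why “approximately zero” is not something proficiency testing can establish, consider a back-of-the-envelope sketch (my own illustration, not a figure from SFS 2009): an examiner who makes no errors on a batch of proficiency tests has only bounded the error rate, not shown it to be zero, and the bound shrinks slowly as the number of tests grows.

    # Illustrative only: the "rule of three" gives an approximate 95% upper
    # confidence bound of 3/n on an error rate after n error-free trials.
    # The trial counts below are hypothetical, not figures from SFS 2009.
    def rule_of_three_upper_bound(n_trials: int) -> float:
        """Approximate 95% upper bound on the true error rate after n clean trials."""
        return 3.0 / n_trials

    for n in (30, 100, 1000):
        bound = rule_of_three_upper_bound(n)
        print(f"{n} error-free comparisons -> true error rate could still be {bound:.1%}")

A thousand flawless comparisons, in other words, still leave room for roughly one error in every three hundred cases, which is a very different claim from “approximately zero.”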

Among the sciences in question are toolmark, track, bite mark and hair analysis, which are central to many rape trials.  “The fact is that many forensic tests—such as those used to infer the source of toolmarks or bite marks—have never been exposed to stringent scientific scrutiny” [pg. 42, SFS 2009].  Other forensic sciences that the report calls into serious question include polygraphs, bloodstain pattern analysis, footwear and tire track impressions, firearms, dental evidence and even coroners’ cause-of-death determinations.

Laboratory error and outright falsification are also significant sources of miscarriages of justice.  A state-mandated review of analyses conducted by West Virginia State Police laboratory employee Fred Zain showed that the convictions of more than 100 people were in doubt because Zain had repeatedly falsified evidence in criminal prosecutions [pg. 44, SFS 2009].

See Through His Eyes: The Case of Ron Williamson

In 1982, Ron Williamson was a physically healthy 29-year-old former minor league baseball player.  One night in December, some miles from where he lived, Debra Sue Carter was brutally raped and murdered.

Because Ron was having personal problems and acting strangely around the time of Debra’s death, the police brought him in for questioning.  Two polygraphs administered during questioning about the murder came back inconclusive.  The police kept him on file as a person of interest in the case until 1987, when prison inmate Glen Gore, questioned about the case, placed Ron inside the club the night Debra was killed.

At the trial, the prosecution relied on “hard evidence”: hairs recovered from the murder scene.  A forensic analyst testified that the hairs matched Ron’s, and Ron was convicted of murder and sentenced to death.

Ron’s mental health began to deteriorate in prison.  Caught in a Kafkaesque nightmare, he saw his pre-existing psychiatric conditions worsen substantially, and he was placed on Thorazine, a powerful and toxic antipsychotic.  He would scream about his innocence, and the guards would taunt him for it.  Five days before he was scheduled to be executed by the State of Oklahoma, Williamson managed to secure a new trial.  The Innocence Project filed to have DNA testing performed for submission at the new trial.

The DNA of the rapist did not match Williamson.  After twelve years of screaming alone in a dark basement about his innocence, he was exonerated.  Broken, tired, and sick from the long years behind bars, he died only five years later (Innocence Project; PBS Frontline).

Take-home Message

The purpose of this article is twofold.  The first is to dispel the myth, held both inside and outside the courtroom, that forensic evidence is foolproof.  Television shows such as CSI reinforce this myth for the general public and provide a distorted view of the rigor of our criminal justice system.

The second is to reinvigorate the fight against the death penalty.  Troy Davis was convicted without physical evidence against him, but it would be a mistake to assume that the man in the cell next to Troy was genuinely guilty simply because there was physical evidence against him.

Even DNA evidence is fallible, and there has been little research into its most common source of error: the laboratory testing itself.  In 1993, for example, Timothy Durham was convicted on the strength of a DNA match that was later shown to be a laboratory false positive.  Had Durham been sentenced to death and executed, the error would never have been found, and an innocent man would have been killed.
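Thompson et al. (2003) make the arithmetic behind this concrete: the weight of a reported DNA match depends on the chance of a coincidental match plus the chance of a laboratory false positive, so even a small lab error rate swamps a one-in-a-billion profile frequency.  A minimal sketch, using purely hypothetical rates:

    # Illustrative sketch of the Thompson et al. (2003) point; the rates below
    # are hypothetical, not measured values for any real laboratory.
    def match_likelihood_ratio(random_match_prob: float,
                               false_positive_prob: float,
                               false_negative_prob: float = 0.0) -> float:
        """Likelihood ratio for a reported match: P(report | suspect is source)
        divided by P(report | suspect is not the source)."""
        p_report_if_source = 1.0 - false_negative_prob
        p_report_if_not_source = (random_match_prob
                                  + (1.0 - random_match_prob) * false_positive_prob)
        return p_report_if_source / p_report_if_not_source

    # A "1 in a billion" profile frequency paired with a 1-in-1,000 lab error rate:
    lr = match_likelihood_ratio(random_match_prob=1e-9, false_positive_prob=1e-3)
    print(f"Evidential strength is roughly {round(lr):,} to 1, not 1,000,000,000 to 1")

On these assumed numbers, the jury should be weighing odds of roughly a thousand to one rather than a billion to one; it is the laboratory’s false-positive rate, not the genetics, that sets the ceiling.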

Fingerprint error rates are unknown and may run into the double digits.  DNA lab error rates are also unknown, and a single careless or dishonest technician can taint the validity of hundreds if not thousands of cases.  Because we cannot be certain of a man’s guilt even by the most certain methods science has to offer, it is crucial that we continue to fight the death penalty.

References and Notes

*The NYADP Blog at Times Union: http://blog.timesunion.com/kaczynski/

Note: Over 50 articles on the questionable quality of forensic science have been published in the eminent journal Science in the last decade alone.  For further articles on the science of forensics, feel free to contact NYADP or the author at cebey1@uic.edu.

[SFS 2009]: Committee on Identifying the Needs of the Forensic Sciences Community, National Research Council, 2009. “Strengthening Forensic Science in the United States: A Path Forward.”  Report to the Department of Justice.  Document Number 228091.  National Criminal Justice Reference Services (https://www.ncjrs.gov/index.html)

(Langenburg et al., 2010): Langenburg, Champod, Genessay, and Jones, 2010 [draft]. “Informing the Judgements of Fingerprint Analysts Using Quality Metric and Statistical Assessment Tools.” Preliminary publication for research funded by grant SC-10-339 from the Midwest Forensic Resource Center, Iowa State.

(Saks and Koehler, 2005): Saks and Koehler, 2005.  “The Coming Paradigm Shift in Forensic Identification Science.” Science Vol. 309 No. 5736.

(Thompson et al., 2003): Thompson, Taroni and Aitken, 2003.  “How the Probability of a False Positive Affects the Value of DNA Evidence.”  Journal of Forensic Science Vol. 48 No. 1.

(Cho, 2002): Cho, 2002.  “Judge Reverses Decision on Fingerprint Evidence.” Science, Vol. 295 No. 5563.

PBS Frontline: Burden of Innocence. http://www.pbs.org/wgbh/pages/frontline/shows/burden/

The Innocence Project. http://www.innocenceproject.org/
