Forensic Science Falls Short of Public Image

"CSI effect" creates ideal expectations in an imperfect world.

On a Las Vegas morning, crime investigator Gil Grissom surveyed the scene of an apparent suicide by a wealthy casino heir, dusting for prints, looking for fibers — any clue to help him and his team find the true story. Through drug analysis, fiber testing and close contact with the police, Grissom determined that the dead man was killed by his brother, who hoped to get a piece of their family’s fortune. Grissom was right. The brother confessed to the crime and was sent to prison.

Anyone watching closely would conclude that crime labs can do remarkable things. And sometimes, they can. But this story wasn’t reality. It was an episode of the television show CSI: Crime Scene Investigation. In real life, crime scenes don’t always yield compelling forensic evidence and analysts don’t always catch everything. Juries, however, have come to expect that they do.

This may seem like a minor problem. It is not. “They see a lot of this stuff on TV in the last 10 years, all these 'CSI' shows, and they think we can pull a rabbit out of our hat,” said FBI Special Agent Doug Seccombe at a recent panel discussion. “And it's not like that.”

In reality, as opposed to TV, crime scene investigators and crime labs are overworked and underfunded. This has led to backlogs of untested evidence, created problems with preserving evidence once it’s collected and, in the latest high-profile crime lab scandal, led a Massachusetts chemist to falsify thousands of lab samples.

This August, drug chemist Annie Dookhan was charged with obstruction of justice after she confessed to falsifying drug samples at the state’s Hinton Crime Lab in Boston. Already, at least 20 people have been released from jail because the drug evidence in their case was handled by Dookhan. Still, more than 1,100 people are currently serving jail time based on evidence she certified.

For an industry that’s held up on television shows like CSI as practically infallible, Dookhan’s testimony was shocking. In interviews with police, she admitted to identifying samples by sight rather than performing the required tests to determine if they were in fact illegal drugs. Dookhan also forged other analysts' signatures on lab reports and communicated directly with assistant district attorneys, crossing the hard line between scientific investigation and prosecution.

“I screwed up big time, I messed up, I messed up bad, it’s my fault,” Dookhan admitted in an interview with police, according to court documents.

The fallout from the faulty drug tests has already taken a toll on the state’s public safety budget. Boston Mayor Thomas Menino has requested $15 million from Governor Deval Patrick to cover the cost of supervision for newly freed drug convicts. Patrick has asked the Legislature to appropriate $30 million for the case in the upcoming year. The state drug lab has been closed indefinitely, and any new drug samples collected are being outsourced to private labs for testing.

Blame has also fallen on the Massachusetts Department of Public Health, which managed the lab where Dookhan worked. Even after other analysts shared their suspicions about Dookhan’s work with their supervisors, reporting that she processed more than 500 samples per month when an average analyst was processing between 50 and 150, no action was taken against her. Governor Patrick has appointed the state’s inspector general to investigate what went wrong in the department, and the Legislature’s Joint Committee on Public Health will hold hearings on both the Dookhan case and ongoing problems at the state drug lab on November 28.

Faulty Forensics?

Few crime lab scandals so clearly involve an individual bad actor, but whether the fault is traced to an individual or a system, the results are roughly the same: Cases have to be retried, convictions are overturned and millions of taxpayer dollars are spent trying to clean up the mess.

Massachusetts is not the only state to experience a recent problem in its crime lab. Earlier this year, the drug lab in St. Paul, Minnesota, was shut down after problems developed with possible evidence contamination. The state lab had to take over the case work. Michigan’s state crime lab faces the same problem. Since the Detroit crime lab closed in 2008, the Michigan State Police lab has been handling all the forensic evidence collected at Detroit crime scenes, as well as trying to work through 11,000 untested rape kits that were discovered in the Detroit lab before it closed.

The ramifications extend far beyond the labs themselves. Before evidence is admitted into a trial record, a judge must determine whether the evidence is scientifically valid and evaluate its relevance to the case. In 1993, the U.S. Supreme Court affirmed that judges are in fact the gatekeepers against “junk science” in the courtroom and identified them as the final arbiter of what constitutes valid scientific evidence.

This is a problem, says Judge Donald Shelton, a trial court judge in Michigan’s Washtenaw County and author of several books on forensic evidence. Many, if not most, judges lack the skill to evaluate forensic evidence properly.

“Many judges don’t have (flawed forensics) on their radar yet, and our judicial education is spotty from state to state,” says Shelton. “We, as judges, owe it to ourselves to become much better informed about the current state of forensic science.”

In fact, the whole field of forensic science is currently in flux, following a top-to-bottom review in 2009 by the National Academy of Sciences. The report cast major doubt on many common forensic techniques, calling them unscientific and error-prone.

Specifically, bite mark analysis, where perpetrators are identified by matching a mold of their teeth to bite marks found on a victim’s body, was found to be entirely unscientific and subject to an individual examiner’s interpretation. Another common technique, analyzing hair evidence, was found to be ineffective at producing any individual match, although it can potentially narrow the field of suspects to people who share certain hair characteristics, like color, hair-shaft form or length.

Educating Judges

While the report’s findings have shaken up the forensic science community, Judge Shelton says that the effect on the courts hasn’t set in yet. “One of my concerns,” he says, “is that these forms of evidence that we know from the National Academy of Sciences report aren’t valid, are still routinely offered and routinely admitted by judges.”

The problem courts face is that juries now demand forensic evidence before they can be convinced that the police did their job investigating a crime. This is what many judges and prosecutors are coming to call the “CSI effect,” the expectation among jurors that every trial will feature foolproof scientific evidence establishing guilt or innocence. Shelton, who has conducted multiple studies on the effects of forensic evidence on juries, calls it a broader “technology effect,” where jurors want to see forensic evidence as a result of the advances in science and technology they use in their own lives.

This effect of forensics on cases is all the more magnified during the plea bargaining process, where forensic evidence, whether solid or faulty, can be enough to convince defendants to plead guilty to a crime they may or may not have committed.

“Particularly in low-level crime, like in the drug cases now in question in Boston, the likelihood that the prosecutor says ‘This is cocaine’ and the defendant can’t prove otherwise, that means they’ll plead,” says Jennifer Laurin, a law professor at the University of Texas at Austin. “The simpler the crime,” Laurin says, the more likely that “forensics are going to be dispositive.”

But for all the faults in forensics and the crime lab process, prosecutors warn strenuously against disregarding certain forensic evidence or techniques as a reaction to a few high-profile wrongful convictions. “It’s ridiculous to say that fingerprint evidence isn’t reliable,” says Scott Burns, head of the National District Attorneys Association. “Or that tool marks, ballistics, or all those pieces of evidence shouldn’t be introduced. In any trial, each piece of fact is going to be assessed. If we throw all that evidence out, there’s not going to be too many criminal trials in this country.”

But carefully evaluating forensic evidence in criminal cases, even at a time when courts are under pressure from budget cuts and high caseloads, is fundamentally important, Shelton says. “In civil trials, it’s only money at stake. In criminal cases, we are going to lock someone up or execute them based on that evidence.”

A Way Forward

It would be unfair to say states have ignored the conclusions of the National Academy of Sciences report. They are moving, albeit slowly, to provide more oversight of forensic labs and build in protections against wrongful conviction. In many cases, the states that have seen major problems in their crime labs have developed the most elaborate systems of oversight.

Texas, which arguably has the most sophisticated forensic oversight of any state, is no stranger to crime lab scandals. In 2002, a Houston TV station aired an investigative report exposing mishandling of DNA evidence in the lab there, and after the lab’s own investigation, all DNA testing was suspended indefinitely. Over the next five years, investigators reviewed 3,500 cases and found that analysts had fabricated test results, lost track of evidence and allowed a roof leak to contaminate DNA samples. So far, two convicts have been exonerated based on the findings of the investigation.

As a result of the problems revealed in Houston, the state began to create an infrastructure to deal with wrongful convictions and provide avenues to investigate claims of misconduct in the state’s crime labs. In 2005, the Texas Legislature created the Texas Forensic Science Commission to investigate any and all claims of misconduct in the state’s labs. Its jurisdiction was later limited to cases where evidence was tested or introduced on or after September 1, 2005.

Texas also has conviction integrity units, both in the Dallas district attorney’s office and at the state’s Court of Criminal Appeals, the highest state court for criminal cases. These units, which also exist in the district attorney’s offices in Manhattan, Detroit (Wayne County) and Chicago (Cook County), are made up of attorneys and judges who reevaluate any case where there are allegations of misconduct, both in forensics and in other aspects of the investigation. In Texas, the Legislature appropriates $18 million every two years for continuing legal education in new research and practices to help the criminal justice system avoid wrongful convictions.

“We’re educating everyone on these topics so that we’re all on the same page,” says Judge Barbara Hervey, who sits on the Court of Criminal Appeals and heads the Criminal Justice Integrity Unit. “Everybody needs to know something about different forensic sciences and the big issues about admitting scientific evidence.”

So far in Texas, the commissions seem to be having the desired effect. The Texas Forensic Science Commission has evaluated more than 40 complaints and made many recommendations to the state’s crime labs, all for about $250,000 per year. Because Texas has a system in place to deal with potentially wrongful convictions, it is equipped to deal with a crisis of the magnitude of the Boston crime lab problem in a way Massachusetts was not.

“If we got (a situation) in the door that had 30,000 cases, it would be hard, but we could handle it because that’s what we do,” says Lynn Robitaille, chief counsel for the Texas Forensic Science Commission. “One thing that you learn is how to prioritize cases and how to conduct a retesting program that makes sense and has the greatest possible impact without bankrupting the entire system. If you’re doing it for the first time with a crisis of that magnitude (like Massachusetts), it’s going to be very, very difficult.”

Reprint courtesy of Stateline, a nonpartisan, nonprofit news service of the Pew Center on the States that provides daily reporting and analysis on trends in state policy.
 

Wayne E. Hanson served as a writer and editor with e.Republic from 1989 to 2013, having worked for several business units including Government Technology magazine, the Center for Digital Government, Governing, and Digital Communities. Hanson was a juror from 1999 to 2004 with the Stockholm Challenge and Global Junior Challenge competitions in information technology and education.