At the heart of a recent evaluation of the use of forensic evidence in criminal cases is the question, “Do you have data to support your claims?” says John Butler, director of forensic science at NIST. Too often the answer is no, according to a September report by the President’s Council of Advisors on Science and Technology (PCAST).

In Forensic Science in Criminal Courts: Ensuring Scientific Validity of Feature-Comparison Methods, a PCAST working group chaired by Eric Lander, president of the Broad Institute of MIT and Harvard, makes recommendations on how to improve the validity and reliability of forensic methods used in the US legal system. Even before the report had been officially released, word filtered out that the Department of Justice was balking at the recommendations.

Two developments in the early 1990s raised the bar on standards for expert testimony: the introduction (by Butler and others) of DNA analysis as a new forensic science and the Supreme Court ruling on Daubert v. Merrell Dow Pharmaceuticals. Under Daubert, if forensic evidence is to be admissible in court, it must be based on empirical scientific data, not merely on the established procedures of its practitioners.

In 2009 the National Academy of Sciences found shortcomings in particular with forensic methods that compare evidence from a crime scene with samples from a suspect. The PCAST report follows up by considering analyses of complex DNA mixtures, hair, latent fingerprints, shoe prints, bite marks, and firearms, in which examiners try to determine the provenance of a bullet or whether ammunition is associated with a specific gun.

This 3D scan, obtained using a confocal microscope, shows the impressions that the breechface of a gun left on the soft metal of the primer at the base of a fired cartridge case. The false color represents depth of deformation, in microns. This type of firearms identification is one of several feature-comparison methods discussed in a recent report by the President's Council of Advisors on Science and Technology.

Many forensic disciplines do not have an underlying foundational validity based on empirical data, the report says. For example, it finds no foundational validity to bite-mark analysis. And it points to a review by the DOJ and the Federal Bureau of Investigation that found testimony based on hair analysis with optical microscopy to be misleading in more than 95% of the trials considered.

The report recommends that NIST prepare an annual report looking at the foundational validity of forensic feature-comparison methods. “Our intention is not that NIST have a formal regulatory role with respect to forensic science,” the report says, “but rather that NIST’s evaluations help inform courts, the DOJ, and the forensic science community.”

Another recommendation urges NIST to take a leading role in transforming three feature-comparison methods examined in the report from subjective into objective methods: analysis of latent fingerprints, firearms, and DNA from complex mixtures. Untangling DNA evidence from, say, many handprints on a steering wheel can be difficult or impossible, explains Butler, whereas identifying someone from a blood sample or two people from evidence in a rape case is relatively easy.

The PCAST report also recommends that the FBI undertake a vigorous research program in forensic science and its practice, that the Office of Science and Technology Policy coordinate a national strategy for forensic science R&D, and that judges at all levels take scientific criteria into account in decisions about accepting expert testimony.

“Most physicists would be shocked by the lack of systematics for quantifying errors,” says University of Maryland theoretical physicist S. James Gates Jr, a member of the PCAST working group. “Justice means you convict the guilty but exonerate the innocent. Science ought to have something to say in this.” Unfortunately, he continues, “what is presented in the courtroom as science often falls short of what a scientist would call science. We need to do better.”

The scientific measurements can be made, says Butler. They involve materials science, chemistry, computer simulations, machine learning, statistics, genomics, and the handling of large data sets. The challenge, he says, is to translate the measurements into methods that can be reliably implemented. Gates adds that it can be tough to convince forensic investigators, police officers, judges, and attorneys to switch to new approaches. “People are resistant to change. We have to engage them in the process.”

In a statement to the Wall Street Journal on 20 September, Attorney General Loretta Lynch wrote that the DOJ “believes that the current legal standards regarding the admissibility of forensic evidence are based on sound science and sound legal reasoning. We understand that PCAST also considered the issue of certain legal standards, alongside its scientific review. While we appreciate their contribution to the field of scientific inquiry, the Department will not be adopting the recommendations related to the admissibility of forensic science evidence.”

Gates, emphasizing that he can’t speak for PCAST, says he is disappointed in the DOJ response but is hopeful that “continued dialog will help resolve any misunderstandings.” He adds that his interactions with most of the forensic science community have been encouraging.

Routine forensic drug analysis, the growing area of digital forensics, and the determination of document provenance are examples of forensic science not included in the PCAST report, whose scope was limited to methods based on feature comparisons.