
Current Affairs

A Magazine of Politics and Culture

We Need To Get Junk Science Out of Courtrooms

Forensic pseudoscience introduced into the courtroom has been used to convict people and sentence them to death on the thinnest of evidence.

M. Chris Fabricant is the strategic litigation director for the Innocence Project, as well as the author of the book Junk Science and the American Criminal Justice System. For years, Fabricant has exposed the weakness and lack of rigor in many mainstream forensic science practices that are used to convict people of crimes. From bite mark analysis to ballistics to hair and fiber evidence to blood spatter analysis, much forensic science admitted in courts of law across the country hasn’t been held to a particularly high standard of rigor, and innocent people have been sent to prison (and even possibly executed) as a result. Fabricant’s book profiles individuals who have been convicted of horrific crimes on the basis of false conclusions by forensic “experts.” He shows how worryingly few safeguards there are for making sure that if you’re accused of a crime, the proof that you did it is actually scientifically valid. Fabricant recently joined Current Affairs editor-in-chief Nathan J. Robinson on the podcast to discuss the problem. The interview has been lightly edited for grammar and clarity.

Robinson  

So how much junk science is there in the American criminal justice system?

Fabricant  

Junk science is a huge problem in the criminal justice system. It has been spreading through our system since the post-World War II era like a virus. It’s everywhere, all the time. At any given moment there’s an expert witness on the stand claiming that that gun is the only gun that could have fired that bullet, to the exclusion of every other gun on the planet. There are problems with certain types of DNA testing, hair microscopy, shaken baby syndrome, old arson investigation techniques, new emerging digital technologies that have not been demonstrated to be reliable, bite mark evidence, shoe print evidence, tire tread evidence, blood spatter evidence—all these forensic techniques are used regularly, despite the scientific community having weighed in on them. And we know that at least half of all wrongful convictions demonstrated through post-conviction DNA testing are attributable, at least in part, to the misuse of forensic sciences. Half! So, this is a foundational and fundamental problem with our legal system.

Robinson  

I want to zero in more precisely on what we’re talking about when we’re talking about “junk science.” You’re talking about the kinds of evidence—ostensibly scientific evidence—that are introduced in order to convict people of crimes, where you have people who are called forensic scientists whose job it is to produce evidence that can link a person to the commission of some deed. Tell us a little bit about forensic science and its distinction from other kinds of science.

Fabricant  

I define junk science in the book as subjective speculation masquerading as scientific evidence. What I mean by that is that there is no scientific or empirical basis for the opinion. It’s based largely on training and experience, and it hasn’t been demonstrated to be valid and reliable through research conducted according to the scientific method and published in peer-reviewed journals, the way mainstream science typically works.

In forensics, what we often have, as compared to mainstream science, are techniques and knowledge generated by law enforcement. And typically, it’s done on an ad hoc basis: a technique becomes useful in a particular case or for a particular crime. Bite mark evidence is an example that I use in the book to demonstrate how a particular form of junk science gets introduced into the legal system. But it really only takes one case—one precedent-establishing case—or one judge to let in one technique, and it’s very, very hard to exclude that evidence forever thereafter, no matter how junky it was to begin with.

Robinson  

One of the serious problems here seems to be that both juries and judges are called upon to be the evaluators of expert testimony, and to determine whether whoever the prosecution has called is correct in their assessment, that, as you said, this bullet is the only bullet, or this bite must have come from this person’s teeth. But because judges and juries are themselves not scientific experts, you’ve created a situation where it is very, very easy to fool people or very difficult to tell junk from real science. But, to be clear, there is real scientific evidence that is introduced as well.

Fabricant  

Absolutely. Isn’t it outrageous that lay jurors are relied upon to distinguish nonsense from sense in criminal courts where life and liberty are at stake? There’s no federal oversight. There isn’t something like the FDA [Food and Drug Administration], where consumer products like toothpaste and aspirin are tested for safety before they’re used by the general public, because we care about the reliability of those products. We care more about the reliability of mouthwash in this country than we do about forensic sciences that are used to sentence people to death and to life in prison. There is nothing like the FDA that does validation research outside of the criminal legal system. The only gatekeeper is a judge. And judges, like jurors, like most lawyers, tend to be scientifically illiterate. This is a widespread problem in this country generally. We have to have scientists doing real research, validating these techniques and demonstrating that they’re reliable outside of the adversarial system, because within the adversarial system, lawyers are trained persuaders. They’re going to take whatever evidence is admitted, and they’re going to argue it proves their point, proves their side, and makes them win. That type of advocacy can twist science to achieve the ends of whatever advocate is arguing for whatever result. In the criminal legal system, that’s typically a prosecutor proffering evidence, typically against an indigent person of color. These dynamics play out in criminal courts every day. Most jurors believe that virtually anybody on trial is guilty. The same is true of judges, and they’re reluctant to exclude evidence that’s helpful in convicting those believed to be guilty. And we talked earlier about the fallacy of that belief: wrongful convictions happen all the time, and half are attributable to junk science.

“We care more about the reliability of mouthwash in this country than we do about forensic sciences that are used to sentence people to death and to life in prison.”

Robinson  

Let’s get into some of the specifics. You single out forensic dentistry and bite mark analysis in the book as a particularly shoddy and unreliable form of science that has been used to convict many people. In fact, your book’s wonderful cover shows a set of human teeth gripping dollar bills to indicate the kinds of conflicts of interest that there are in the field of forensic dentistry, where dentists can make good money issuing this kind of testimony in a courtroom. So perhaps we can go through, starting with bite mark analysis and forensic dentistry, what some of these problems are.

Fabricant  

One of the really interesting things about bite mark evidence, when you’re researching it, is that you can locate the first case in which it was introduced in criminal courts. You can track the history of bite mark evidence in ways that you can’t really with other forensic techniques. I did additional research to try to find out where this came from. How did we get bite mark evidence in criminal courts? How did this become a thing? I tracked it all the way back to the first case, and then I filed a lawsuit against the American Board of Forensic Odontology to get into their archives, because they were excluding me from viewing the source material, their origin story, because I’m a critic. So I had to sue to get into this archive.

Once I got in, I found some of the original literature. There was this group of forensic dentists that had been working adjacent to forensic sciences. They were in the medical examiner’s offices identifying dead bodies of people that were John Does and Jane Does. This is fundamentally a valid technique where people are identified by their dental records, right? You hear these stories all the time. “So-and-so’s body was burned beyond recognition and was identified by the dental records.” What people don’t typically think about is, “Who does that?” It’s your friendly neighborhood forensic odontologist. This is what they call themselves when they are in court. That toehold in forensics allowed them to advocate for more of a slice of the forensic pie.

At that time, in the post-World War II era, forensic scientists were becoming stars. Shows like Quincy were becoming popular. The Sam Sheppard trial was a sensational trial that featured a lot of forensic science, particularly forensic pathology. The bite mark guys weren’t yet a thing, but the dentists were there, and they wanted to get into court. They wanted to be expert witnesses, and they started pointing to bite marks as a path into the expert witness game. And so there were articles openly calling for and advocating the use of this technique, even though there was no science behind it whatsoever. They got together [and] formed the American Board of Forensic Odontology. They started giving themselves credentials and board-certifying each other, even though they never had to demonstrate that they could actually do anything, like identify a bite mark as such, match it to somebody, or exclude anybody.

Then they found the first case. It was in California, and the mark was in cartilage rather than on actual human skin, which is much more variable and even more difficult to interpret. The judge in that case said in a written opinion that there was no science here and recognized that there had not been any experimentation. But it was considered such basic stuff, just matching evidence, that [it was thought that] the jury wouldn’t be hoodwinked by it and would be able to see for themselves whether or not a bite mark matched. The evidence was let in. And that opinion, the Marx opinion, was the germinal case in American jurisprudence on this. It got cited all over the country by states’ high courts. It was the first time that this evidence was admitted at trial, and it created precedent around the country for its continued use. 

After that, they hit the jackpot. I devote a chapter to Ted Bundy. Mainstream media has been a huge problem in terms of perpetuating the myth of infallible forensic science. The media culture built up in the post-World War II era has perpetuated this myth, and if you look back at the most well-known junk sciences, much of their spread has been propelled by sensational trials. And bite mark evidence was central to the Ted Bundy case because, in the end, shockingly, there was really no other evidence to convict him the first time he was tried, right after the so-called Chi Omega murders. 

There are no eyewitnesses. There are no fingerprints. There is no hair. There’s nothing. The only physical evidence tying Ted Bundy to that scene was alleged to be bite marks. And because Ted Bundy was a madman and wanted to represent himself for aspects of his case, he himself cross-examined these dentists who had invented the whole thing. And they became national celebrities, because the case rose and fell on bite mark evidence. And after that, we’re still talking about Ted Bundy in bite mark cases today. It’s the quintessential case study.

Robinson  

In cases where it turns out the person is in fact guilty, it can appear to validate the technique, right? Ted Bundy was not innocent of the crimes of which he was accused. But the point is that it does not, in fact, logically follow from his guilt that the testimony was sound: that these were provably his bite marks, and that the scientists testifying could, in fact, identify them as necessarily made by his teeth.

Fabricant  

Broken clocks are right twice a day. You’re pointing out a really important aspect of this, which is that there’s no feedback loop, very often, for forensic scientists to understand how often they get it right and how often they get it wrong. For a long time, and still today to a large extent, the experts point to convictions as evidence of the reliability, as evidence of their error rates, and say, “Well, I haven’t had any overturned, so they must have all been correct.” The Innocence Project work demonstrates the fallacy of that logic.

Robinson  

In political science, people used to prove that welfare reform worked by showing how many people had gone off welfare. They’d say, “Look how effective it was, the welfare rolls were reduced by so much.” And you say, “Yeah, but what happened to those people’s lives? Are you actually improving people’s lives? You’re measuring it by the wrong statistic.” One of the other things that you and I had talked about earlier was the fact that if you have a field where everyone shares the same background assumptions, but the background assumptions are wrong, you can have what Richard Feynman called “cargo cult science”—something that looks like scientific practice but is actually shoddy. You have peer reviewed journals, but everyone reviews each other’s work without ever really subjecting themselves to the kind of rigor to know whether their techniques are in fact capable of proving the thing they think they can prove.

Fabricant  

Yes, it’s a fundamental problem with forensics. A lot of it goes back to what we were talking about earlier in terms of the difference between mainstream science and forensic sciences. One of the other examples that I point to in the book is arson investigation. Arson investigation, like bite mark evidence, hair microscopy, blood spatter evidence, and firearms analysis, operates in essentially a guild-like structure. The masters of the trade have received wisdom that is passed down from mentor to mentee, generation to generation. A lot of it is folklore. What I mean is that it sounds science-y, and there are big textbooks, and there are leading practitioners of the field who become very high-flying and highly paid consultants. But it’s just never been tested.

We have over thirty wrongful convictions attributable to arson techniques that the Texas Forensic Science Commission and the National Academy of Sciences have pointed out were bogus. These alleged arson indicators are actually just received wisdom that happened to be wrong and was passed down from generation to generation, with all of the bells and whistles of science but none of the meat.

Robinson  

Could you explain how this might look in practice? It’s like a witness says, “I can look at this pattern of the dirt here, and I can tell that it was made by this kind of scuffing of these kinds of pants, or I can look at these marks on the wall from this fire and discern this.” Perhaps you could give us some specifics as to what the kind of testimony would be and why it’s not, in fact, nearly as reliable as it looks and sounds when that person is explaining that there is a connection between this piece of evidence and the conclusion that they draw.

Fabricant  

One of the analogies that I like to use is when you’re out on a field, and you and your friend are looking at the clouds, and you point to one and say, “Hey, doesn’t that cloud look like an elephant?” “Yeah, it does look like an elephant.” You see the trunk; you see everything. And then once you see the cloud, you can’t unsee it, because it’s been suggested to you that that’s what it looks like. You may never have come to that conclusion on your own, but another person has made this suggestion as a way to interpret what you’re seeing. Take that into the forensic realm and look at something like an injury on human skin. You have somebody who’s been declared an expert by the judge, in front of the jury. They call themselves a forensic odontologist. They say they’ve done hundreds of these cases, and they’re a diplomate of the American Board of Forensic Odontology and Chief Forensic Odontologist of the State Medical Examiner’s Office. Lots of fancy credentials. They say, “Through my training and experience, I can see here that this is the upper bicuspid, and this is the central incisor that made this mark. I’m orienting the bite this way, and these marks are from the lower teeth over here.” It’s a very scientific-looking photograph, and it’s been taken with great precision. There will be a right-angle ruler on it showing that it’s done to scale, and it’s a proper photograph.

What you don’t really get is that that dentist has no idea whether that’s a bite mark or not, any more than you or I would. His or her guess is as good as yours or mine. But you can dress it up and offer this kind of suggestion to a lay juror, and it’s very, very compelling. In the book, I wrote about how I sat with Keith Harward’s brother. Keith Harward served 34 years in prison for a rape and murder he did not commit. His brother, who believed in his innocence, went to court every day of his trial. The day that Keith Harward was exonerated, I was sitting with his brother, and we were talking about the trial. And he said that during the course of the trial, he began to doubt his own brother’s innocence when the forensic odontologist testified. It was so compelling that he began to ask himself, “What are my little brother’s teeth doing on that woman’s thighs?” That’s how compelling it was.

We know that junk science is powerful enough to convict the innocent, and you can see it at work. The same was true with arson indicators and hair microscopy. Take arson: investigators would identify so-called pour patterns that looked as if somebody had taken lighter fluid or something like that, squirted it on the floor, and lit it on fire. If you’re looking at burn patterns on the floor, and you have an expert saying that is a pour pattern, and you’re a juror with no experience in this area, how are you going to critically evaluate that? You’re going to accept what the expert says.

Robinson  

I wrote about this subject a few years ago, citing your work, for the Boston Review and got a reply from some forensic scientists. The first thing I noticed was that after their name, they listed about ten different initials for credentials they had. And I, being someone who was cynical towards highly credentialed people, thought how ridiculous that you puff yourself up that way instead of using actual arguments, proof, and evidence, of which they had none to refute anything that I said about their field. But I can imagine [what it’s like] if you are called to evaluate the innocence or guilt of a person accused, and someone gets on the stand, and they say, “I am trained professionally to analyze hairs. This is a hair that was found at the crime scene. These are the hairs of the suspect. I have all of these credentials for the American One Hair Analysis blah blah blah Society, and I can confirm to you that this person’s hair was at the crime scene.” I mean, what are you to do? 

Fabricant  

If you accept the testimony, the defendant is guilty, right? The litigation that I do, and that the lawyers in my department do at the Innocence Project, is post-conviction cases where this type of evidence has been used, and those are the cases I write about in Junk Science. What is really, really heartbreaking and enraging is that, in many of these cases, experts—to their credit—have recanted. Say that thirty years have gone by: “The progress of science has led me to the inescapable conclusion that what I testified to at trial thirty years ago was wrong, as a matter of science, and I’m recanting my opinion.” Courts, however, don’t credit this. We have a client right now whose appeal we lost on Friday. [The court] is saying that the expert witness recantation didn’t matter because, going back to the original bite mark case, this stuff is so simple that lay jurors themselves will be able to tell the difference between a match and not a match. So, the expert witness recantation doesn’t matter. And moreover—and we get this a lot—[they’re saying that] it wasn’t that important to the trial. This so-called scientific testimony didn’t really matter. [But] we all know science sells. Only sex sells better than science. Every ad for every consumer product that you can think of [claims to be] scientifically tested. They say that because it sells. And in court, for a lay juror who’s looking for the truth, who wants to get it right, you offer them so-called scientific evidence from somebody who’s highly credentialed, somebody who’s supposed to be at arm’s length from the case and not have an axe to grind with any particular person. That’s a kind of truth that no other type of evidence can offer, and so long as we continue to allow speculative evidence in as scientific evidence, we invite wrongful convictions. 
One of the reasons I wrote the book was to try to change the narrative in popular culture around the use of this evidence. 

Robinson  

The book is about junk science, but we can distinguish between junk science that is a type of analysis that simply cannot be done, essentially pure cargo cult science, where, as with looking at the clouds, it just doesn’t work. It might feel like it works, but it’s never been tested. It’s about as good as the people who think they can move spoons with their minds, right? And then there’s another category: real scientific evidence badly applied or not held to standards of rigor, where the field has been tested and the type of analysis is generally possible, but the person doing the analysis is not reliable. Could you talk about the distinction between those two categories?

Fabricant  

Yes. Fingerprint evidence is a really good example of this. On the spectrum of trace evidence, at one end you have bite mark evidence, which can never be done reliably under any circumstance. That should never have been a thing. Fingerprints, on the other hand, are generally reliable, much more so than some of the other trace evidence techniques. But you can take what is basically a reliable technique and make it pretty junky, depending on how much information you have or don’t have. One of the real problems with fingerprints, and forensics generally, is that there aren’t any national standards for, well, anything in particular, not even the threshold issue of how much information you need in a latent fingerprint to make a so-called match. We know, today, that fingerprints have not been demonstrated, as a matter of science, to be unique. I think they probably are. I’m not arguing that they aren’t. But we don’t know this as a fact. What’s more important in forensics is that we don’t know how similar two fingerprints can be. When you’re talking about latent fingerprints, these are smudges at crime scenes. If we don’t have any standards for how much information you need in that smudge, then you have a real problem of creating potential wrongful convictions, false positives, because some fingerprint experts will be willing to declare a match based on very little information. What we get in that type of situation is the influence of cognitive bias on the conclusion. All forensics have a certain amount of subjectivity, some much more than others. Fingerprints are no different in that no measurements are being taken: nothing that says, within a stated measurement of uncertainty, that when we declare a match we know exactly what that means, that this loop and that loop on the fingerprint agree within, say, a one-millimeter degree of confidence. We don’t do that; it’s eyeballed.

When that happens, we have the potential for the influence of cognitive bias, which is when irrelevant information influences a subjective decision. Very famously, Brandon Mayfield was wrongfully arrested for the bombing of the commuter trains in Madrid, Spain in 2004. That arrest was based on an FBI fingerprint match on the plastic bag where the blasting caps were found. It was a poor-quality print. The problem with the case was that, in addition to the low quality of the print, Brandon Mayfield happened to be Muslim. He happened to have represented somebody who had once been convicted of providing material aid to a terrorist organization. He was married to an Egyptian national. So, the FBI is like, ding ding ding, that’s the guy. And two other forensic experts, both highly qualified fingerprint examiners, confirmed the wrong match. Both said, yes, we agree, that’s Brandon Mayfield’s print. It wasn’t until the Spanish authorities identified the actual perpetrator that the FBI finally backed off. That sent alarm bells through the fingerprint community, because there had not been many high-profile wrongful convictions, wrongful arrests, or false positives in the fingerprint world. 

Dr. Itiel Dror, a cognitive neuroscientist at University College London, did a really important and very clever experiment after that. He took a group of forensic experts, all board-certified fingerprint examiners, gave them casework, and asked them to do evaluations on it. What was clever was that he didn’t tell these experts that this was their own prior casework, and that they had already come to conclusions on these particular sets of latent and exemplar prints. The only thing that changed was the contextual information included in the case files, information pointing one way or another, like an eyewitness or a confession, something like that. And three-fifths of the experts involved in that experiment changed their original conclusions based on nothing but contextual information. The evidence was exactly the same. Fingerprint evidence is a relatively reliable technique if you have a good-quality print. We don’t know what the definition of high quality is; it would be good if we had one. But take a low-quality print, and you can turn something reliable into junk science.

Robinson  

And how do you know that in any given case it’s being done well, and reliably? When I was at the New Orleans public defender, I was kind of shocked because I was sitting in on a hearing about fingerprint evidence, and the New Orleans fingerprint examiner was on the stand. He was asked how he knew whether what he was doing was reliable, and how he could determine a match. And he said, well, because you look at one fingerprint, and you look at the other to see if it matches. But you think, “Yeah, but how do you know how correct you are about this match? What’s your likelihood of being wrong?” The very concept of having a likelihood of being wrong and an error rate just was foreign to the fingerprint examiner, which was really shocking to me. But as you point out, that’s happening all around the country all the time. 

You’ve mentioned the bad financial incentives and the cognitive biases that people can have. It’s also the case that, as you pointed out, there’s this very close tie between forensic scientists and law enforcement, where they’re kind of an arm of law enforcement rather than independent scientists. Could you talk about how that kind of strange mixture of the role of scientist and criminal investigator for the state affects what happens?

Fabricant  

Sure. In 2009, the National Academy of Sciences issued a monumental report on the state of forensics in the United States. It looked at thirteen different forensic techniques and the use of forensic sciences generally. It was the first time that mainstream scientists had looked at forensic science in a really systemic way and given it a critical review. The National Academy of Sciences is the most respected and prestigious scientific entity in the United States, and perhaps the world. These people know what they’re talking about. Their number one recommendation was that you have to separate crime labs from law enforcement, that science is science. There shouldn’t be prosecutor science [and] defense science; there should be science. We know where the phrase “cops in lab coats” came from: so many detectives have so-called graduated into the crime lab without getting advanced degrees in science or being introduced and indoctrinated into the culture of mainstream science. In the post-World War II era, when forensic science became a thing and forensic scientist was a career you could aspire to, the people attracted to that field were not necessarily attracted to science, per se, but were interested in law enforcement and crime solving, in catching bad guys and advancing justice. These are largely well-intentioned, civic-minded folks who got into this field, but they really don’t see an issue with being an arm of law enforcement. And you have some of the biases that we talked about. If you talk to folks in crime labs, they are often very resistant to blinded analysis. They want to know what happened in the case, and they think it’s important to their work. That type of close relationship, essentially being an arm of law enforcement, coupled with a law enforcement mentality, creates a useful tool for prosecutions but creates bad science. Good science is objective. Bad science is good at getting convictions, but it is not as credible as it could be if it were separated from law enforcement.

Robinson  

Your book really humanizes and profiles the victims of junk science. We’ve been talking about bad scientific practice, and we’ve been talking about why people believe it. Your work at the Innocence Project is focused on finding people who have had their lives completely ruined by this stuff. So perhaps we could, as we draw to a close here, highlight what’s at stake in getting this right versus getting this wrong.

Fabricant  

I got back from New Orleans yesterday, and I was visiting my client who’s on death row in Angola penitentiary. His entire case rests on bite mark evidence that has been discredited for years and years. We don’t have slam dunk DNA in the case, and it makes litigating, even in a death case, extremely challenging. And I have three clients on death row right now that were all put there by bite mark evidence. I have another client that is on death row due to microscopic hair comparison evidence. The stakes could not be higher. 

The first client story that I tell at the beginning of Junk Science is the Keith Harward story. What is chilling about that story is how close he came to death in his first trial after his conviction for capital murder. His parents got on the witness stand and begged for his life. The jury decided to spare him. Had he not been spared, undoubtedly, he would have been executed. We would never have heard his story.

My paralegal researched bite mark cases in 2012 when we decided to search for anybody convicted on bite mark evidence. He came across this appellate opinion and handed it to me. This is the first story I tell in the book. Reading this appellate opinion and being skeptical around bite mark evidence, you’d have been like, God, he sounds innocent.

What’s enraging to me, when we talk about the stakes and the continued use of junk science, is that we tend to get the science right in civil litigation, when money is at issue. And we tend to get the science right when consumer products are at issue. But we don’t care about the science that is used to decide the life and liberty of, overwhelmingly, poor people and disproportionately Black and brown people in this country. We just don’t care, though the stakes could not be higher. I devote another chapter in my book to “poor people science,” which is why I call forensics that.

Robinson  

And we know, of course, also that at least one person, Cameron Todd Willingham, has actually been put to death on the basis of this kind of testimony.

Fabricant  

Yes. In Junk Science, I write about what I argue are four wrongful executions. Cameron Todd Willingham is the best known of those cases. He was one hundred percent wrongfully executed on the strength of the totally discredited, so-called arson expert testimony that was debunked, posthumously, in an over 800-page report from the Texas Forensic Science Commission. Claude Butch Jones was executed on the strength of hair comparison evidence and circumstantial evidence that may have put him at the scene but not inside the liquor store where the proprietor was shot. Hair comparison evidence put him inside the liquor store. Posthumous testing of that hair evidence showed that it was not his. There was also David Wayne Spence, also a Texan, also convicted and sentenced and put to death on bite mark evidence. There’s Tommy Lee Walker, also in Texas. Polygraph evidence was used to coerce a confession out of him. He also was put to death. So, there have been wrongful executions in this country. Undoubtedly. Those are just four that I happen to write about. They all rest on the power of junk science to persuade credulous jurors into convicting the innocent and condemning them to die.

Robinson  

It’s pretty important to get this stuff right. It really is astonishing how little rigorous attention is paid to the problem of junk science and criminal punishment. When I wrote my article years ago and was looking for sources, most of what I found was by you, because you’re the guy who writes about this. There was that huge National Academy of Sciences report. But there’s little attention paid to the question of whether all of the testimony that is being used to take people’s lives away from them is founded in really sound logical inference, or in something that looks like it but is cargo cult-y. One wishes there were a lot more discussion of this than there is.

Fabricant  

Perhaps my book will change the narrative a bit. Certainly that was why I wrote it.


Transcript edited by Patrick Farnsworth.
