Science isn’t some static body of dusty facts—it’s a method of inquiry into the unknown, a structured way of thinking about problems.
After all, scientists aren’t like investigators; they are investigators…which is why we PIs can probably learn a thing or two from the scientific disciplines. It’s no accident that they’re called “disciplines,” after all—the natural sciences offer us systematic techniques to hone our thinking and leave our assumptions behind.
With that, a few thoughts on what scientists can teach us about thought itself:
1. Question the “facts.”
There’s a common perception that Science=Fact, an Unimpeachable Construct not subject to the vagaries of faith, error, or human emotion. But science, like any human endeavor, is limited by the flaws inherent in the human mind.
Science isn’t just a body of knowledge, a set of facts, or an explanation of the physical world. It’s not some passive, dusty stack of tomes; it’s active—the collection and interpretation of evidence, a creative formulation of questions and hypotheses, a careful and systematic study of the unknown.
Certain “facts,” after all, are more fluid than we realize. Einstein was pretty darned sure the universe had always been the same size it is now. For a moment in 1989, a couple of scientists knew they’d achieved cold fusion. And that same year, New Yorkers—and Americans—were convinced that five black and Latino youths had beaten and raped a jogger in Central Park. The five had confessed, after all, and a signed confession is Fact with a capital “F,” right?
Apparently, the smartest guy of the 20th century was wrong about something. Hopes for cold fusion went into the deep freeze when the experiment couldn’t be reproduced. And the Central Park 5 were exonerated after spending their youth in prison. Their confessions, it turned out, were obtained through lying and coercion. Worse yet, the case seemed to confirm some widely held beliefs about crime, class, and race; at some point the Central Park 5 ceased to be individual kids accused of a crime. They became symbols—much like the Duke lacrosse team did in 2006-7—and we humans do love to summarily convict symbols of anything we especially fear or loathe, all evidence aside.
Turns out, it matters very much how we obtain the pieces of evidence that, when corroborated and triple-checked, eventually become hard facts. It matters what questions we ask ourselves before the investigation begins, how open our minds are as we make observations, and what judgments we make as the observations and evidence stack up.
Behind most facts, there’s a story, or possibly a few caveats.
Take witness testimony: Jurors routinely convict based on witnesses’ statements, although eyewitness accounts are notoriously unreliable. In fact, one researcher in the department of Criminology, Law and Society at the University of California at Irvine says that memory is so malleable that it’s possible to create wholly convincing recollections through simple suggestion. As we know well, it’s not only what a witness says that matters; it’s why he or she says it.
Like any investigative field, science is just as flawed as the humans behind it. The best scientists know this, which is why they’ve established a rigorous discipline to govern the ways they think, experiment, and interpret. It’s called the scientific method, and we investigators are authorized to adapt it to our needs as best we can.
That method can help us separate the hard, cold facts built on solid foundations from the softer, more malleable “facts” stacked like houses of cards.
2. Even smart people are biased.
Unfortunately, we’re not as rational as we think we are. We make assumptions, take mental shortcuts, and harbor biases we don’t even know about.
As a kid, I must have heard the “doctor riddle” dozens of times—on the first telling, at least, it exposed people’s assumptions (including mine) about the gender of surgeons. Now there’s a slightly more scientific way to shed light on our hidden biases: The Implicit Association Test is a computer-based assessment that asks users to respond quickly to a series of words or images that relate to race, weight, age, and even gender roles in various careers.
Each test takes about five minutes, and the results can be painfully illuminating. If you dare, you can try them here.
We’d like to think that intelligence and education can protect us from prejudices and errors in thinking, but a 2012 study in the Journal of Personality and Social Psychology suggests otherwise: Smarter people may actually make more cognitive errors. Even worse: the scientists who led the study (Richard West and Keith Stanovich) found evidence that knowing about our own biases doesn’t necessarily help us to overcome them.
The problem is, they suggest, that whatever causes our biases stems from the unconscious mind—a place to which we have little access. But there’s good news, too: Although we may not be able to consciously outsmart our biases, there’s a workaround—the scientific method. “We have human perspectives, and they bias the way we look at the universe,” said astronomer Natalie Batalha in an interview on the public radio program On Being. “But if we stick to the observations, it’s a method of removing that human perspective and, when we do so, amazing things happen.”
3. Be ready to see the thing you’re not looking for.
One of the biggest dangers to good science and good investigation? Confirmation bias—the tendency to see only the evidence that backs up our hypothesis (and ignore the facts that don’t).
In 2010, Harvard University found rock-star primatologist and psychologist Marc Hauser guilty of scientific misconduct, asserting that he’d manipulated and even created data in his study of primate learning.
And there’s a chilling analog to Hauser’s story of science fraud unfolding this month in Texas: the startling case of former Williamson County DA Ken Anderson (now a judge), accused of prosecutorial misconduct in the Michael Morton case.
In 1987, Anderson prosecuted a man named Michael Morton for murdering his wife Christine. Anderson and Williamson County sheriff Jim Boutwell were so convinced of Morton’s guilt that they ignored evidence that suggested an intruder committed the crime: a bloody bandana found near the scene; neighbors’ reports of a strange van parked nearby and a man watching the victim’s house; a detailed account by the victim’s (and Morton’s) 3-year-old son of “a monster who hurt mommy.”
Worse, Anderson hid some of this evidence—most likely exculpatory—from Morton’s defense team. Pamela Colloff’s gripping two-part series in Texas Monthly details the case, from the crime and conviction to the Innocence Project’s investigation that freed Morton in 2011…and the arrest of Christine Morton’s real killer.
Wrongful convictions aren’t nearly as rare as they should be. But are Morton’s conviction, and Hauser’s bad science, the result of deliberate fraud, or of confirmation bias?
In the science world, there’s tremendous pressure to publish, and publication requires a startling result; thus, bias may favor the exciting result over the mundane. Similarly, intense public pressure may induce law enforcement and prosecutors to close cases quickly, and even to fight the introduction of new evidence on appeal that might overturn a conviction—which Anderson did, vehemently, in Morton’s case. Maybe Anderson so wanted to see facts that pointed to Morton, and Hauser so wanted to see evidence that his primates could learn, that that’s all they could see.
The moral? Investigators, scientists, and all manner of truth seekers have to be honest with themselves about what incentives may be influencing their thinking. Does the client want “the moneyshot” at all cost? (As one client of ours crassly, and semi-jokingly, demanded.) You can guard against that pressure by stipulating up front that moneyshots aren’t guaranteed, and that payment won’t depend on producing one.
4. Discovery is joyful…and rare.
“Understanding is a form of ecstasy,” writes Carl Sagan in Broca’s Brain: Reflections on the Romance of Science. But in science, and in our work as investigators, the ecstatic aha moment is hard-won, and often elusive. Sometimes, what we want to find just isn’t there, and it can’t be manufactured.
But “in the fields of observation,” as Louis Pasteur said, “chance favors the prepared mind.” There’s no “Eureka!” insight in Archimedes’ bathtub or under Newton’s apple tree without all the mathematics and physics study that came before. The Innocence Project worked for years to overturn Michael Morton’s conviction, and the prosecution fought back at every turn. The odds for Morton looked slim, and the work must have seemed futile at times.
In this wonderfully funny TED talk of 2007, James Watson tells the story of how he and Francis Crick discovered how DNA stores and copies information. In April 1953, the final version of their double-helix model with its matching base pairs occurred to them in a flash of insight: “If it didn’t work this way, you might as well believe it,” he jokes, “because you didn’t have any other scheme.
“But that’s not the way most scientists think,” he adds, to the audience’s great delight. “Most scientists are really rather dull. They say, ‘We won’t think about it until we know it’s right.’ But we thought, ‘It’s at least 95 or 99 percent right, so think about it!'”
So Watson and Crick did think about it, and they eventually agreed with themselves…and won a Nobel Prize.
Watson wasn’t being serious. But what struck me about his quip was this: He didn’t say, “We’re 99 percent right, so let’s go ahead and publish it.”
He said, “We’re 99 percent right, so think about it.”
Think about it. In other words, let’s keep working on it until we know. Eureka!