Over the past couple of months, I’ve seen a wave of stories focused on why some people distrust science. In fact, there have been so many stories published that keeping track of the latest information has been a challenge.
I’ve read a lot of them, and what we’re learning about how and why people reject science is truly interesting. It seems we’ve assessed the current state of trust, and clearly defined the problem and its origins – but what are we supposed to do with this new information?
A few of the articles ‘sort-of’ offer suggestions for how communicators can operationalize the research, but the recommendations are too broad. For example, several pieces said we should help scientists develop better communication skills and get more of them to engage with the public.
Wait…I thought that was science communication 101?
What have we learned?
Since you probably haven’t had a chance to read more than a few of the recent articles, I’ll share some highlights from a few of the pieces cited at the end of this post:
- Media loves controversy. Media will give equal time to anti-science non-experts, and label it ‘balance.’ Media also favors stories focused on shocking/amazing discoveries, but rarely takes the time to give insight into the years of work and teams of scientists involved in the research.
- We believe (or refuse to believe) scientific facts because we want to fit in. Even when faced with seemingly indisputable facts, the need to be accepted by peers and to belong will trump science every time. People view science through lenses tinted by their community, their friends, their religion, their job, etc. The good news is that even the most hardened opinions can be softened by family, friends, and colleagues who truly understand the culture fostering the disbelief. Relationships are important.
- The deficit model still doesn’t work. Trying to change the mind of a disbeliever by spouting off a list of facts is likely to make the person defensive, limiting their ability to consider facts that seem contrary to their ideology. Surprisingly, distrust in science doesn’t seem to be reserved only for the uninformed observer. Research by Dan Kahan of Yale University has found that the strongest opposition to scientific topics is seen in people who already have a good understanding of the science.
- Kahan also emphasizes the importance of ‘disentangling’ science messages from the ‘cultural baggage’ we all carry. (see bullet #2)
- Political party affiliation is not what drives trust/distrust in science. Both liberals and conservatives support/dismiss scientific evidence, but they focus on different topics. And ‘high-profile’ scientific debates may diminish overall trust in all science.
- Chastising those who disagree with generally accepted scientific facts isn’t effective. Finding common ground by identifying things we all agree on is a better way to start the conversation.
What are we supposed to do with this?
What we’re learning is important, but I’m struggling to identify ways in which I can apply it to the strategic marketing and communication activities at my university. I suspect other university communicators may feel the same. That said, there are some very practical things we can do, which may help maintain – and potentially enhance – public trust in science:
1. Be honest…even when it’s difficult.
Science communication is more important than people may think. The information we share impacts lives, industries and reputations. It also influences policy. We’ve got to get it right. It’s not easy to admit when we don’t, but we have an obligation to do it.
This reminds me of a man who challenged authority and went public (in a big way) after finding problems with his team’s research…because it was the right thing to do.
Don Grace’s* most embarrassing moment occurred in the spring of 1989. The University of Utah announced that one of its chemists, along with a co-researcher from the University of Southampton, had achieved cold fusion in the laboratory.
Within two weeks, researchers at the Georgia Tech Research Institute (GTRI) confirmed the wonderful news. “We had the media in and had a great presentation,” Grace ruefully said.
The trouble was, the [GTRI] experiment was fatally flawed due to temperature-related instrumentation errors. The Tech team, following the same protocols as the original experiment in an effort to duplicate the results, arrived at the same wrong conclusion. Only four days after their experiment, the GTRI researchers, led by Dr. James Mahaffey, detected the error.
“Then came time to admit that we were wrong,” Grace recalled in a 2006 interview. Universities are notoriously averse to admitting mistakes in public. Grace said that a number of people strongly advised against going to the media again. But he didn’t take that advice.
“Of course it would be embarrassing,” he continued. “Jim and I blushed the whole time. But we did something that was incorrect and we had to face up to it and get on with it. It was the right thing to do.”
*Don Grace was Director of GTRI from 1976-1992
The bottom line – if you mess up, you need to fess up. It may wind up enhancing your credibility in the end.
(See a video of Don Grace sharing memories of this experience.)
2. Refuse to play games with research news
We work hard to avoid releasing stories that overstate research results. With every university competing for attention, I can see how making a story appear ‘sexier’ could be appealing. Remember that you’re putting your reputation, and that of your institution, at risk if you decide to roll the dice.
We also go to great lengths to ensure everything we release to the public is as accurate as possible. This includes reviewing academic papers, interviewing researchers and working directly with our compliance office to confirm appropriate research protocols were in place, and conflict of interest and regulatory issues have been addressed. We also examine research contracts to identify if there are public release exemptions and/or sponsor review requirements. The researchers are asked to review stories for accuracy prior to release, and research sponsors are often asked to approve the text and photos we plan to share publicly.
Our process requires a lot of pre-planning. And on rare occasions, we’ve had to kill stories we’ve spent a lot of time on when approvals didn’t come through.
Is this a flawless system? No. We’re lucky to have a staff of seasoned science communicators who are very good at recognizing research that ‘has enough meat on the bone’ to be publicized. More importantly, they know which research isn’t ready for prime time.
My point is that you need to do your homework before blasting out a press release on your institution’s latest amazing discovery. Overselling a story hurts your reputation and the researcher’s reputation, and tarnishes your university’s reputation too. Aside from that, it certainly doesn’t help boost public trust in science. …oh, and it’s also likely that science bloggers (and tweeters) will administer a bit of public discipline.
What else can we be doing – right now – to boost public trust in science?
I look forward to hearing your ideas.
- Why science is so hard to believe, Joel Achenbach, The Washington Post, February 12, 2015 (also in National Geographic and The Guardian) (link)
- Why don’t people trust science? Tom Spears, Ottawa Citizen, February 21, 2015 (link)
- Our Partisan Brains: Exploring the psychology behind denying science, Erik Nisbet and R. Kelly Garrett, The Conversation US, March 12, 2015 (link)
- Science Dilemma: Between public trust and social relevance. Hans Peter Peters, Euroscientist, February 25, 2015 (link)
- This is where distrust of science really comes from: It’s not just your politics. Chris Mooney, The Washington Post, March 2, 2015 (link)
- Shooting the Messenger: The erosion of trust in science and what to do about it. John Burrage, Australasian Physical & Eng. Sciences in Medicine, March 6, 2015 (link)
- A Matter of Trust, Bernadette Keefe, Healthcare Leadership Blog, March 7, 2015 (link)
- Science communication in the age of polarization. Matthew Nisbet, Social Science Space, March 9, 2015 (link)
- Why science denial is about much more than corporate interests. Chris Mooney, The Washington Post, March 13, 2015 (link)
- Why we pick and choose which science to believe. PBS Newshour, February 18, 2015 (link)
- The Politics of Science: Political Values and the Production, Communication, and Reception of Scientific Knowledge (Special March 2015 issue), The ANNALS of the American Academy of Political and Social Science, March 2015, 658: 6-15, doi:10.1177/0002716214559004 (link)
- Bessi A, Coletto M, Davidescu GA, Scala A, Caldarelli G, et al. (2015) Science vs Conspiracy: Collective Narratives in the Age of Misinformation. PLoS ONE 10(2):e0118093. doi:10.1371/journal.pone.0118093 (link)
- Public and Scientists’ Views on Science and Society, Cary Funk and Lee Rainie, Pew Research Center, January 29, 2015 (link)
- Don Grace Profile, Georgia Tech Research Institute Historical Archive (link)
- Change at the Top: The Grace Years, Georgia Tech Research Institute Historical Archive. (link)
- Georgia Tech Team Reports Flaw In Critical Experiment on Fusion, William J. Broad, The New York Times, April 14, 1989 (link)
- Donald Grace Interview, Georgia Tech Research Institute Historical Archive, December 7, 2006 (link)
(Editor’s Note (3/19): Bullets 2 and 3 at the top of this piece were edited for clarity, based on excellent crowdsourced feedback)