Journalists tend to understate – not overstate – scientific findings, study finds

Science journalism has seen a resurgence during the pandemic. Every day seems to bring a new study, a new article, a new discovery for readers to break down and interpret.

But how accurately do journalists communicate these findings?

A study from the University of Michigan found that journalists tend to understate the certainty of claims in scientific articles. As part of their research on how certainty is expressed in science communication, School of Information Ph.D. student Jiaxin Pei and assistant professor David Jurgens compared hundreds of thousands of scientific paper abstracts with the news articles that reported their findings.

“Findings presented in science news are actually reported with less certainty than the same scientific findings presented in the paper abstracts,” Pei said.

Pei and Jurgens found that journalists can be very cautious in reporting science, sometimes downplaying the certainty of findings. Their research contradicts claims that journalists exaggerate scientific findings.

To reach their conclusions, Pei, Jurgens and a team of human annotators rated levels of certainty in paper abstracts and news articles collected through Altmetric, a service that tracks news coverage of scientific papers. They then built a computer model capable of reproducing these ratings, allowing them to analyze hundreds of thousands of news articles and abstracts.
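To make that modeling idea concrete, here is a minimal sketch in Python of how a certainty scorer of this kind can be trained: a pretrained encoder is fine-tuned with a regression head to predict a human-annotated certainty score for each sentence. The base model, the score scale, and the two toy training sentences are illustrative assumptions, not the authors’ actual code or data.

```python
# Sketch only: fine-tune a transformer with a regression head to predict
# human-annotated certainty scores for finding sentences.
import torch
from torch.utils.data import DataLoader
from transformers import AutoTokenizer, AutoModelForSequenceClassification

MODEL = "bert-base-uncased"  # assumption: any encoder works for this sketch
tokenizer = AutoTokenizer.from_pretrained(MODEL)
model = AutoModelForSequenceClassification.from_pretrained(MODEL, num_labels=1)  # num_labels=1 -> regression

# Toy annotated examples: (finding sentence, certainty score assigned by annotators)
train_data = [
    ("The drug reduces mortality by 30%.", 5.8),
    ("The drug may be associated with lower mortality.", 2.4),
]

def collate(batch):
    texts, scores = zip(*batch)
    enc = tokenizer(list(texts), padding=True, truncation=True, return_tensors="pt")
    enc["labels"] = torch.tensor(scores, dtype=torch.float).unsqueeze(1)
    return enc

loader = DataLoader(train_data, batch_size=2, collate_fn=collate)
optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)

model.train()
for epoch in range(3):
    for batch in loader:
        out = model(**batch)   # with float labels and num_labels=1, the loss is mean squared error
        out.loss.backward()
        optimizer.step()
        optimizer.zero_grad()
```

Once trained on the annotated sentences, the same model can score certainty in unseen abstracts and news stories, which is what lets an analysis like this scale to hundreds of thousands of documents.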

These discrepancies became particularly evident when they analyzed different aspects of certainty, such as ‘number’. For example, their findings, along with previous research, indicate that journalists may be replacing specific numbers found in scientific articles with language like “roughly” to make their writing more accessible.

Although they don’t have definitive answers as to why journalists understate scientific findings, Jurgens speculated that one reason could be that journalists think it’s best to err on the side of caution. He noted that the job of journalists can be difficult: they must translate scientific work so that it is understandable to a general audience.

While some think overstating scientific findings is worse than understating them, Jurgens said the latter can also have negative effects. He cited COVID-19 vaccine reports as an example.

“Scientists are fairly certain that vaccines are safe,” Jurgens said. “But I think raising the uncertainty about it could make people less willing to get vaccinated, or to not seek health care. In this case, in the context of the pandemic, it could mean loss of life, which is a pretty serious outcome.”

Pei and Jurgens also examined how the “journal impact factor” – their proxy for measuring the quality of science – affects how journalists present scientific findings. They found that the journal from which a study originated did not seem to influence how journalists described scientific uncertainty.

That can be a problem, Pei said, because higher-impact journals have a more rigorous review process. Knowing the prestige or reliability of the journal in which a scientific article is published could be useful information for readers.

For journalists who want to improve the way they describe scientific findings, Pei advises talking to the scientists behind the study they’re trying to cover. Jurgens noted, however, that scientists can also be “really poor communicators.”

“It’s an open question,” Jurgens said. “How can we effectively communicate this in an accessible way?”

When asked how certain they were of their own results, Jurgens and Pei said they were “fairly certain.” The model they built produced levels of certainty very similar to those calculated by the annotators, and their analysis included hundreds of thousands of data points. Their article was published in the Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing.

Pei and Jurgens noted, however, that they could always use more data and that their study did not look at other areas where people might perceive exaggeration in the news, such as headlines.

The next step in their research is to talk to journalists and figure out what tools they could use to improve their reporting. They brainstormed ways to help journalists translate the work of scientists for the general public.

One step that Pei and Jurgens have already taken is publishing code that allows journalists and scientists to calculate the level of certainty in their writing.
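Their released tool is not reproduced here; the snippet below is a hypothetical sketch of how such a certainty scorer could be called once a fine-tuned checkpoint like the one above is available. The checkpoint path and the two example sentences are placeholders, not the authors’ actual release.

```python
# Hypothetical usage sketch: score the certainty of a paper-style sentence
# and a news-style paraphrase with a fine-tuned regression checkpoint.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

CHECKPOINT = "path/to/certainty-model"  # placeholder, not a real model name
tokenizer = AutoTokenizer.from_pretrained(CHECKPOINT)
model = AutoModelForSequenceClassification.from_pretrained(CHECKPOINT, num_labels=1)
model.eval()

sentences = [
    "Vaccination reduced hospitalization rates by 90% in the trial cohort.",  # paper phrasing
    "The vaccine may roughly cut hospitalizations, researchers suggest.",     # news phrasing
]

with torch.no_grad():
    enc = tokenizer(sentences, padding=True, truncation=True, return_tensors="pt")
    scores = model(**enc).logits.squeeze(-1)

for sent, score in zip(sentences, scores.tolist()):
    print(f"{score:.2f}  {sent}")  # higher score = more certain, under this sketch's convention
```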

“There are a lot of open questions in this (natural language processing) area,” Pei said. “With more effort in this area, we will be able to provide tools and systems for journalists to cover science.”
