Understanding numbers key to crafting accurate stories
BY TAYLOR WYLLIE
We’ve all seen it: a story that seems too good — or too bizarre — to be true.
Red Wine: Cure for lung cancer? Pessimists live longer lives, according to a new study. Eat more and weigh less?
These stories are classic cases of misrepresented statistics.
Statistics are an essential tool for outdoor communicators. Numbers and studies give veracity to news, and searching databases can be an excellent way to find story ideas.
But using statistics and attempting to comprehend scientific articles when you aren’t trained as a scientist can be hard. Then regurgitating the information in an informative and easy-to-understand manner adds to the challenge.
But you can do it. Here’s how.
Say you stumble upon a report that claims if bison in the Greater Yellowstone Ecosystem are fed jelly beans every day in July, they have a higher chance of surviving the winter.
Yes, this is a ridiculous example. But you should treat every report you find as though it is as ludicrous as my claim. It’ll be easier to find the leaps in logic.
There are two ways to go about reporting this story. You can write an article that says a group of scientists is studying the effects of jelly beans on the ecosystem's bison. You're not insinuating anything. It's less a story about the bison and more a story about the jelly bean scientists.
Or you can report on the claim itself, think: Bison survive winters better on a jelly bean diet, new study says.
The latter will likely get you more clicks, but it’s also the story that can get you into trouble if you don’t understand statistics.
Let’s say the scientists fed 100 bison jelly beans every day throughout July and tracked the animals throughout the winter. The scientists found 92 out of the 100 bison alive in March, while the average survival rate for the area bison population is 90 percent.
First, look at your source. Is the report sponsored by a jelly bean company? Is it in a peer-reviewed journal?
Then think about the scope of the study and the results. How large is the sample size, or number of animals in the study? How long did the scientists test their theory? What do the results really mean?
It's easy for researchers to force statistically significant results, meaning results that seem like they couldn't have happened by chance, through everything from increasing the sample size to decreasing variability in the sample.
You need to remember: statistical significance is different from practical significance. The latter concentrates on a study's applicability in the real world.
Maybe 92 of 100 bison survive a Yellowstone winter on the jelly bean diet, and that result is statistically significant. But think about it. Do two extra bison actually make a difference here? Perhaps a group of tourists put one of the bison in the back of their car and park staff had to kill it. Maybe another bison was incredibly old and had a heart attack before winter even began. There are reasons outside diet that bison live and die, and when the numbers don't rise or fall enough to make you suspect something is going on, then probably nothing is going on.
Not to mention the study only took place over a year. Almost all natural phenomena need to be studied over long periods of time because of the variability in the natural world. One year does not make a pattern. It could be an anomaly: maybe it was a rainy summer and a warm winter, so the bison had easy access to food, or maybe it was the opposite.
But say the study took place over 10 years and 98 or 99 of the 100 bison survived on the jelly bean diet every year. There isn't any magic number that tells journalists when a study becomes practically significant, but again, think about the results. It's tough to explain away eight or nine extra surviving bison every year for 10 years. Something must be going on.
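To make this concrete, here is a minimal sketch, using only the Python standard library and the article's invented bison numbers, of a one-sided binomial tail probability: the chance of seeing a result at least that good if survival really ran at the 90 percent baseline. (The numbers are the article's hypothetical, not real data.)

```python
# Hypothetical numbers from the bison example: is 92 of 100 surviving
# really different from the 90 percent baseline? A one-sided binomial
# tail probability gives a rough answer.
from math import comb

def tail_probability(successes, n, baseline):
    """P(X >= successes) if survival really happens at the baseline rate."""
    return sum(comb(n, k) * baseline**k * (1 - baseline)**(n - k)
               for k in range(successes, n + 1))

# One-year study: 92 of 100 bison vs. a 90 percent average survival rate.
p_one_year = tail_probability(92, 100, 0.90)

# The 10-year scenario: 98 of 100 surviving in a given year.
p_strong = tail_probability(98, 100, 0.90)

print(f"92/100 vs 90% baseline: p = {p_one_year:.2f}")  # roughly 0.3: chance alone explains it
print(f"98/100 vs 90% baseline: p = {p_strong:.4f}")    # well under 0.01: hard to dismiss
```

A tail probability around 0.3 means a result like 92 of 100 would show up by luck almost a third of the time, which is the arithmetic behind "two extra bison probably mean nothing," while 98 of 100 would be a rare fluke indeed.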
Now is the time to check if other studies have found the same results. If other studies haven’t replicated the findings, make sure to mention that in your article. It’s one of many things you need to consider as you start to write.
In this example, you probably don’t need help in making your story interesting. But if you’re writing a piece about the dissolved oxygen levels in a local stream, or the stress of warmer temperatures on a specific type of fungus, here are a few tips:
Talk about the significance
Even if the data appears to be boring, the context surrounding it may be anything but. Don’t focus on the numbers, focus on what the numbers mean. Take the bison example. Perhaps Yellowstone has outlawed bringing jelly beans into the park, as they don’t want to increase the longevity of park bison. That is a lot more interesting than listing numbers.
Be careful not to impose unfounded explanations for the study's findings. We all know the phrase: correlation does not imply causation. Just because A and B occur together doesn't mean A causes B. There could be a third factor at play.
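The "third factor" problem can be seen in a toy simulation. The sketch below invents a hidden variable (call it winter severity) that drives two other quantities; the variable names and numbers are made up for illustration, not drawn from any real study.

```python
# A toy illustration of why correlation alone can't establish causation:
# a hidden third factor can drive two variables that never influence
# each other, yet leave them strongly correlated.
import random

random.seed(0)

# Hidden factor: say, winter severity. It affects both outcomes below.
winter_severity = [random.gauss(0, 1) for _ in range(1000)]

# Two outcomes that each depend on winter severity, but not on each other.
forage_access = [w + random.gauss(0, 0.5) for w in winter_severity]
survival_score = [w + random.gauss(0, 0.5) for w in winter_severity]

def pearson(xs, ys):
    """Plain Pearson correlation coefficient, no external libraries."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

r = pearson(forage_access, survival_score)
print(f"correlation: {r:.2f}")  # strongly correlated, yet neither causes the other
```

A reporter who saw only the two outcome columns could easily write "better forage predicts survival," when in fact neither variable does anything to the other.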
Explain the process
Write conversationally and give plenty of examples. People are better at understanding stories than definitions and numbers. If it fits, you might even consider crafting a first-person narrative in the story or as a sidebar so you can explain how you analyzed the data set yourself, acting as a surrogate to the reader.
Use infographics
It's easier to read numbers when they're separated from blocks of text. Plus, infographics allow for fun, interactive designs on pages that might otherwise be heavy on text.
While taking a step back and working with statistics may seem tedious and even overwhelming, in the end you’ll create a far stronger and more accurate article. And if you’re still confused? It’s okay to reach out and ask for help. ♦
— Taylor Wyllie is an OWAA intern and student at the University of Montana, pursuing a degree in both journalism and environmental studies. She’s reported and edited for the independent student newspaper, The Montana Kaimin, for two years and her work has appeared on Montana PBS, Montana Public Radio and in the Missoulian.