That’s my story and I’m sticking to it


Which came first: the question or the answer? That may seem a little ridiculous, somewhat like that old “chicken or egg” question. But for those of us who spent our careers conducting fish, wildlife, and natural resources research, it’s a common question. Lots of people have answers they like and they use them for everything, no matter what the question may be.
As a research supervisor I heard the following statement all too often: “I want to conduct some research to prove that …” The researchers knew the answer they wanted. Then it was just a matter of designing experiments to “prove” their answer.
The problem with this approach is that honest research is conducted to find out, not to prove a pre-existing belief. This problem is not restricted to research. We are exposed to it all the time in politics, in religion, in environmentalism, and in most subjects that have the suffix “ism.” People have beliefs, often strong beliefs, which provide the answers they want. So they make the questions fit the answers; or they have the same answer, no matter what question has been asked. My older brother was a remarkable practitioner in the art of making all questions fit his political and religious beliefs. No matter what question was posed, the answer was predictable. I’ll spare OU readers any additional details.
In the world of resource management, predators are commonly blamed for "low" game populations, whether or not there is any evidence to support that answer. Mismanagement by state wildlife managers frequently runs a close second, again whether or not there is supporting evidence. Curiously, in fisheries management, predators are seldom blamed for low populations of preferred species. In the aquatic world, the culprit that gets blamed is usually an "alien invasive," or mismanagement by state fisheries managers.
It’s a common problem: First come beliefs, then questions are interpreted and “evidence” carefully selected to support answers in line with those beliefs. If facts — real facts — do not support the belief-based answers, the facts must be wrong.
Our readers, viewers and listeners consider outdoor communicators to be authorities. Therefore, we have a responsibility to avoid the “answers come first” syndrome, the mentality of “It’s a good answer, it works for every question.”
We all have opinions and beliefs, even we "objective scientists." The best, most accurate, repeatable scientific evidence must still be interpreted and summarized. Scientists learn to state just how certain they are about the results of their work and whether their work can be repeated by other scientists. If beliefs or opinions are expressed, they are stated as beliefs. But how does the non-scientist communications specialist deal with these problems without losing his audience?
One common statement that I avoid in my writing is "experts/scientists say." Tell your audience who these experts and scientists are and quote them exactly. Look for numbers, but not just any numbers. If someone tells me that Olmsted County, Minn., (my boyhood home) has a whitetail deer population of 7,892 deer, my B.S. detector blares "red alert." No one can estimate a wildlife population that exactly. I call such numbers "pseudo-quantification." Honest statistics always carry confidence limits, "plus or minus" ranges that show how uncertain an estimate really is.
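To see why a figure like "7,892 deer" should trip anyone's B.S. detector, here is a minimal sketch of how a survey-based population estimate actually comes out. All numbers are hypothetical, invented purely for illustration: the plot counts, the county area, and the survey design are not real data.

```python
# Minimal sketch (hypothetical numbers): an honest population estimate
# always comes with "plus or minus" confidence limits, never an exact count.
import statistics

# Deer counted on 10 hypothetical one-square-mile survey plots.
plot_counts = [12, 8, 15, 10, 7, 14, 9, 11, 13, 6]
total_area = 650  # hypothetical county area, in square miles

mean_density = statistics.mean(plot_counts)  # deer per square mile
# Standard error of the mean density across plots.
sem = statistics.stdev(plot_counts) / len(plot_counts) ** 0.5
# t-value for a 95% confidence interval with 9 degrees of freedom.
margin = 2.262 * sem

estimate = mean_density * total_area
plus_minus = margin * total_area
print(f"Estimated population: {estimate:.0f} +/- {plus_minus:.0f} deer")
```

The point of the sketch is the last line: the defensible statement is "roughly 6,800 deer, plus or minus about 1,400," not a single four-digit number presented as fact.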
Another criterion for separating scientifically valid conclusions from beliefs and opinions is repeatability. Is the answer just the belief of one person, or have other investigators studied the question, found the same results, and interpreted the results in such ways that they arrived at the same answer?
Is there a different answer or conclusion that could be drawn from the same data? Communicators must be careful with this problem. Conspiracy believers can always find different answers. Be very suspicious of answers where blame is assigned to “THEM.”
One of the most difficult problems for non-scientists, and even many scientists, is detecting research that was designed to obtain a specific answer. I reviewed a book recently that dealt with the question of whether fish feel pain. Many studies were conducted and others cited, but several of my “crusty, old colleagues” and I noted that the reactions used to prove pain in fishes were reactions that fish typically show when they are avoiding potentially harmful situations, whether or not actual injury has occurred. The researchers were convinced in advance that fish feel pain and designed their studies in ways that showed exactly what they wanted. All alternative explanations were ignored.
So what’s an outdoor communicator to do? Recognize your own beliefs and opinions.
If your opinions and beliefs are not supported by valid evidence, admit it. Tell your audience that this is an opinion, or a belief. Always remember that “belief can explain everything, but Mother Nature demands evidence.”♦

—A member since 2005, John Nickum is from Fountain Hills, Ariz., and is a writer, editor and educator. Contact him at

