Ironically, doing research is the best way to be right. What people want is to feel right without having to think very hard. Feelings don’t really require energy in the same way that thinking does.
More than just research is needed, and that's what many miss. One must be able to reliably evaluate the quality of evidence to sort fact from baloney. Doing so requires critical thinking, the ability to poke holes in theories whether you like them or not, the willingness to be wrong, and, above all else, the mental flexibility to update your knowledge when proven wrong. Not everyone is able to do that.
I am used to being wrong a lot, so it comes naturally lol.
Plus the methodology. There’s an idea of actively seeking out research contrary to one’s hypothesis; this helps circumvent the confirmation bias of only looking for things that support a hypothesis and ignoring anything contradictory. It can be healthy to find and consider dissenting opinions.
Another fundamental issue is people using different meanings for similar words. Someone with a strong understanding of the scientific method will say things like “I believe” or “studies show”, while someone else will say things like “this is” or “we know”. Colloquially the latter is stronger language conveying more confidence, but the former is more likely to be evidence based. “Theory” is used colloquially the way a scientist would use “hypothesis”. People will say “I have a theory” about something that’s only a few sentences long and doesn’t make any reliable predictions, then put down an actual theory backed by years of supporting evidence and peer review as “just a theory”.
Feelings are SUPER important to humans because they’re a huge efficiency boost. We take everything we’ve ever learned in our lives and crunch it down into a feeling for how the world works. Then we make the vast majority of our decisions by using that “gut feeling”. Can you imagine how ridiculously inefficient it would be to have to analyze every new scenario you come across?
The big problem today is that people lean too hard on that idea and assume that because their feelings are right most of the time, feelings must be equivalent to truth.
In other words, shortcuts and biases really just trade accuracy for speed.
Those many cognitive biases we succumb to may be great for scenarios faced by hominids a hundred thousand years ago or more. But for sussing out truth and evaluating evidence, they're straight caca.
The problem arises from the fact that the internet in particular incentivizes attracting attention above all else, with no incentive for being correct, nuanced, or well-researched. Combine that with the fact that people like to be right about things, doubly so when everyone else is wrong, and you create a world where conspiracy, woo, and other bullshit is actually an industry. I feel like that's the part that always gets lost in these discussions: people are making money from this.