Logical Biases: Preserving Fact from Perception

Today, we’re going to talk about the biases our brains have when processing information. This matters because of the political and social environment we’re currently in: we have to understand why some people think the way they do, and how they manage to maintain irrational beliefs despite overwhelming evidence that those beliefs run contrary to the facts. These logical shortcomings are so prevalent that even scientific professionals have to guard against them. For ordinary people, though, it’s much harder to self-diagnose and control for these subconscious biases, and the result is often misunderstanding and misinformation. I tell you, our subconscious’s fear of cognitive dissonance is something to be feared (and rightly so).

However, falling into the trap isn’t only the individual’s fault. What we’re exposed to in the media, in the social circles we’re part of, and what authority figures tell us to think can also perpetuate these widening ripples of falsehood.

1. Confirmation Bias

In science, researchers conducting experiments try not to let their own predictions about the outcome affect the actual outcome. Seeking out information that specifically confirms your own beliefs while passing over or outright ignoring information that goes against them is called confirmation bias. This clip from a CH video explains it quite well, and it also helps explain the vaccine = autism mindset some people have nowadays.

Researchers attempt to avoid this bias either by developing an objective way of measuring results or by performing blind or double-blind studies, in which the participants and the researchers directly observing the results don’t know what sort of result is expected and therefore can’t skew the actual outcome.
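To see why this matters, here’s a minimal simulation sketch (all numbers are made up for illustration) of what happens when an analyst only “counts” the measurements that confirm their hypothesis. The treatment in this toy example truly has no effect, yet the cherry-picked average suggests otherwise:

```python
import random

random.seed(42)

# Hypothetical setup: the treatment truly has NO effect, so every
# measurement is just noise centered on zero.
measurements = [random.gauss(0, 1) for _ in range(1000)]

# An unbiased (or blinded) analyst averages everything.
unbiased_estimate = sum(measurements) / len(measurements)

# A biased analyst keeps only the results that confirm the hypothesis
# that the treatment helps (the positive values) -- confirmation bias.
confirming = [m for m in measurements if m > 0]
biased_estimate = sum(confirming) / len(confirming)

print(f"unbiased estimate: {unbiased_estimate:+.2f}")  # hovers near zero
print(f"biased estimate:   {biased_estimate:+.2f}")    # looks like a real effect
```

Blinding works precisely because the analyst can’t tell which results are “confirming,” so there’s nothing to cherry-pick.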

That said, some researchers have knowingly skewed the results of an experiment to confirm a result beneficial to their agenda. For example, when obesity first made its mark on modern society in the form of heart disease, a theretofore uncommon condition, people went looking for something to blame. Fat became the scapegoat while the sugar industry got a free pass. When people found out, a man by the name of Ancel Keys got most of the blame.

Here’s a (slightly biased) video about the whole incident…

 

…and a more objective review of the whole affair, outlined here by a fellow denizen of WordPress:

The Truth:

  • Ancel Keys did not drop any countries from the Seven Countries Study. His most famous graph—the first one up above—is from a different paper he presented at a World Health Organization (WHO) conference in 1955. The Seven Countries Study didn’t even launch until 1958, and entailed much more than just plopping numbers into a pretty curve. (That said, the Seven Countries Study had plenty of problems too; some are mentioned on this site.)
  • Contrary to popular belief, the cherry-picked graph didn’t convince everyone that fat was evil. In fact, Keys was pretty much ridiculed for the weakness of his fat/heart disease theory by other scientists at the WHO meeting, and whenever his graph was cited in medical journals later on, it was usually paired with some criticism. Although Keys’ work definitely shaped our current beliefs about fat, this graph didn’t exactly take the world by storm. (More on this later.)
  • When all 22 countries were analyzed, the association between fat and heart disease did not go away. It actually remained statistically significant (meaning it probably wasn’t due to chance). And to make matters worse, the paper frequently cited as a “rebuttal” to Keys shows pretty clearly that animal protein had an even stronger association with heart disease than total fat did. The China Study was right all along! Time to go vegan, you guys. (Just kidding. But this part is the most interesting of all, and we’ll examine it in excruciating depth in a moment.)

Although some of his saga has been misconstrued, Keys was still far from perfect—and his eventual role in demonizing saturated fats (while glorifying polyunsaturated fats) has led us down an unfortunate road.

Click here to see the whole article. (Kudos to the author for such thorough research.)

The fact that Keys’ own bias could have gotten in the way of objective research is something everyone can learn from. Be careful to get the whole story when looking at an issue or an event, and don’t surround yourself only with sources or people who agree with you.

Here’s a good source to learn more about this bias.

2. False Consensus Effect

Know someone who is clearly wrong, but so sure they’re right that they assume everyone else agrees with them? That’s the false consensus effect. It’s especially dangerous in positions of power: a person with decision-making authority may make decisions for other people while assuming their subordinates (or constituents, for that matter) agree with them. This makes them less likely to actually seek out the opinions of others, and it also leads them to surround themselves with a bubble of “yes” people, furthering the delusion.

See here to learn more about this effect.

The false consensus effect is also tied to another logical fallacy which we will expand on next…

3. Availability Heuristic

This is our brain’s tendency to rely on whichever information is easiest to recall. It’s why news and other media outlets aren’t the most reliable basis for forming conclusions about the world. Since what the news covers is limited and focuses on certain stories by nature, what we do hear about seems bigger and more prevalent than what we don’t. For example, shark attacks are often sensationalised, and when one does happen, the news outlets report on it. However, as the popular comparison goes, more people die from incidents involving vending machines than from shark attacks (see here for far likelier ways you’ll die). Yet people are still more paranoid about being attacked by sharks than about dying from vending machines, simply because there are more examples available to draw upon: they’ve been exposed to more of them.
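A quick simulation sketch makes the point concrete. All the numbers below are invented for illustration (they are not real statistics): the rarer cause of death gets far more coverage, so anyone judging risk by the stories they can recall gets it backwards.

```python
import random

random.seed(1)

# Hypothetical annual death counts -- illustration only, not real data.
true_deaths_per_year = {"shark attack": 5, "vending machine": 10}

# Suppose the news covers nearly every shark attack but almost no
# vending-machine deaths, because sensational stories get airtime.
coverage_rate = {"shark attack": 0.9, "vending machine": 0.05}

# Build the pool of stories a news watcher is exposed to over
# 100 hypothetical years.
stories = []
for cause, deaths in true_deaths_per_year.items():
    for _ in range(deaths * 100):
        if random.random() < coverage_rate[cause]:
            stories.append(cause)

# Judging frequency by ease of recall = counting the stories you saw.
shark_stories = stories.count("shark attack")
vending_stories = stories.count("vending machine")
print(f"shark stories seen:   {shark_stories}")
print(f"vending stories seen: {vending_stories}")
# The watcher concludes sharks are the bigger danger, even though the
# true death counts say the opposite.
```

The estimate is skewed not because the watcher is careless, but because the sample of information reaching them was skewed before they ever formed a judgment.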

This is especially important when it comes to decision-making and other social phenomena like mass hysteria. It’s also why propaganda works. Given plenty of examples teaching them to hate and fear the Japanese and the Chinese (each at different times in US history), the public ate it up and adopted the hate, with posters like these:

[Image: US propaganda poster depicting the Japanese enemy]

[Image: poster titled The Yellow Terror in all his Glory]

And this was before television became popular. So… you can imagine the impact of television now.

Then, to tie it back to the false consensus effect: when people draw conclusions based only on the information they’re exposed to, unaware of anything else, it lends false confidence to their beliefs. And when they expouse* those beliefs to a group of people who are less sure, the beliefs spread into the misinformation epidemic we face today.

For a full list of all biases, see here.

Well, that was all for today. Don’t forget to like and follow! We’d love to hear what you think of this topic.

This is Lieutenant out.

*Expouse is a word coined by me. The official definition is:

Ex•pouse /ik•spous/

Verb

  • (usually of a political or religious belief) to spew with great conviction