Confirmation Bias gives us a feel-good feedback loop. To change the system, we have to challenge it.
Part of the evolutionary success of human beings has been based on the ability to recognize, learn, adapt, and grow. As much as we would like to consider ourselves independent thinkers, the way we form opinions has more to do with our tribal upbringing and environment than anything else. When discussing both confirmation bias and filter bubbles, we are dealing with a blend of ancient survival skills and the sophistication of the human mind.
Confirmation bias reinforces what we already know so that we can make decisions quickly. We prefer words, shapes, and sounds that are familiar, and the more we are exposed to them, the more willing we are to accept them. However, humans also operate on another level referred to as “motivated reasoning,” which is described as “an emotion-based decision-making phenomenon studied in cognitive science and social psychology. This term describes the role of motivation in cognitive processes such as decision-making and attitude change in a number of paradigms, including: Cognitive dissonance reduction.”
Much of this comes down to how someone was raised: whether they were encouraged to question the status quo or to fall in line and adopt whatever is supplied to them. The divide can be framed as defensiveness and tribalism versus a value set that seeks to root out the truth in a grounded way.
Confirmation bias goes beyond merely accepting the information fed to an individual; it also taps into the emotional bond the data creates with preconceived ideas that are central to the individual’s sense of self. There is almost a Pavlovian reward response: the more incoming information feeds previously held ideologies, the greater the feeling of pleasure. The problem with embracing confirmation bias is that it creates a distorted opinion built on select information alone, dismissing additional data simply because it doesn’t match the belief system. This same mechanism underlies brainwashing and propaganda techniques, and we see more evidence of it daily as people struggle to sort out what is or isn’t the truth.
A Scientific American article entitled “Biases Make People Vulnerable to Misinformation Spread by Social Media — Researchers have developed tools to study the cognitive, societal and algorithmic biases that help fake news spread” covers three types of bias that encourage tainted views:
Bias in the Brain
Information overload maxes out the brain’s ability to separate fact from fiction, pushing it toward biased shortcuts and cutoffs.
Bias in Society
Connecting directly with peers means that the selection of friends influences the information people see and adopt, creating an “echo chamber” that is ripe for manipulation.
Bias in Machine
The complex algorithms used to decide what individuals view online, in both search engines and social media platforms. Under the guise of “personalization,” these technologies hone content down to whatever is most relevant and engaging while reinforcing pre-existing cognitive biases.
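The “bias in machine” dynamic can be illustrated with a deliberately simplified sketch. Everything here (the function name, the topic labels, the scoring rule) is hypothetical, not any platform’s actual algorithm; the point is only that a ranker which scores items by similarity to what a user has already clicked will keep surfacing more of the same:

```python
from collections import Counter

def rank_feed(candidates, click_history):
    """Toy personalization: rank items by how well their topics match
    topics the user has already clicked. There is no diversity term,
    so every click narrows the feed toward what the user already likes."""
    interest = Counter(click_history)  # how often each topic was clicked
    def score(item):
        _, topics = item
        # average familiarity of the item's topics; unseen topics score 0
        return sum(interest[t] for t in topics) / len(topics)
    return sorted(candidates, key=score, reverse=True)

# A user who has clicked mostly on one viewpoint...
history = ["team_a", "team_a", "team_a", "team_b"]
feed = [
    ("Team A triumphs again", ["team_a"]),
    ("Team B's case, explained", ["team_b"]),
    ("Neutral analysis of both teams", ["team_a", "team_b"]),
]
# ...sees confirming content first, while the opposing view sinks last.
print([title for title, _ in rank_feed(feed, history)])
```

Run it and the confirming headline ranks first while the dissenting one ranks last; with each further click on “team_a” content, the gap only widens.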
Google Refined the Filter Bubble
We have to remember that marketing has always been about convincing the public to be attracted to a product or service. This form of communication has been around since humanity first attempted to draw figures, and it has been a consistent method in advertising. However, it was always a hit-or-miss approach, hoping the message reached the demographics most likely to buy. Everything changed with the advent of the internet, and entirely new divisions of marketing were developed to dig deep into the psyche of the average user.
Although many companies became expert at using the internet, it was Google that took advantage of its power of information to distill the very essence of its search engine into individual filter bubbles. There was a dollar value attached to the data it could sell, so that a company could send the same marketing message to the exact individuals who had already expressed interest.
An additional level of power lies in ranking search results with algorithms based on previous searches, so that only the seemingly pertinent information rises to the final output. The general public wasn’t aware that these displays were creating confirmation bias and a filter bubble, because to the average person the results were simply more “personalized.” The news, data, and even products that appeared seemed designed just for them, saving wasted scrolling time.
At its heart, a filter bubble provides content for consumption so closely matched to personal preferences that it creates a sealed chamber presenting a restricted view of the bigger picture.
The manipulation problem of filter bubbles is obvious. Many people don’t want to be told that they are not only being molded and managed but are submitting to it willingly. We live in a fast-paced, short-attention-span society that encourages the path of least resistance, and filter bubbles let us cut to the chase more quickly. That we don’t see alternative concepts, viewpoints, or even services and products seems a lesser concern; and this is where the fine line arises between being controlled and controlling.
The Danger of Combining Confirmation Bias and Filter Bubbles
No matter what side of the aisle you are on, empowering others to feed you information that doesn’t present the whole story is the basis for misinformation at an escalated scale. The difficulty with what is occurring in our social structure is that people have become accustomed to this methodology, and only a few refuse to adopt it without question. We have seen the fallout on some of these topics: the ridiculous birther accusation against President Obama, the claim that the Boko Haram abduction of schoolgirls in Nigeria was a made-up incident, and the Russian influence in the 2016 U.S. Presidential election and beyond.
As misinformation has evolved, it has become the status quo to accept data without even questioning the sources. Personality types divide those who accept information verbatim from those who choose to seek out sources and fact-check. The sad reality is that a majority of people buy into the versions presented to them through confirmation bias and filter bubbles, just because it’s easier and it feeds their inner belief system.
The release of blatantly false information has been weaponized under the name of “fake news” by individuals whose sole purpose is to create the lies. The onslaught has been so great that it has produced a state of cognitive dissonance, leaving many to give up and retreat to their bias and filter-bubble comfort zones.
On a global scale, we seem to have lost our ability to be furious at being fed a constant diet of lies. There is a broad need to ask the tough questions, to devote time to researching multiple sources, to find where the data is really coming from, and to seek out precisely what the “truth” is.
Fake news has infiltrated every aspect of our existence, and many of us pass on a story after reading only the headline. In this day and age, each of us becomes a publisher as we share the stories. The danger of supporting a limited view lies in the consequences that can arise from confirmation bias wrapped inside a filter bubble.
A BBC article states: “One possibility is that we simply have a blind spot in our imagination for the ways the world could be different.”
This comment may seem like an easy explanation, but when dealing with the human psyche, it goes far deeper than that.
Can We Fight the Problem?
What began with a few voices expressing anger at being manipulated by the very technologies we depend upon has risen to a crescendo. It took Congressional hearings involving high-ranking members of notable companies such as Google and Facebook to bring the topic of fake news to the forefront. For those genuinely listening, there was a fuller explanation of the methods behind confirmation bias and the filter bubbles that allow these forms of manipulation to occur.
The same BBC article goes on to state:
The finding is good news for our faith in human nature. It isn’t that we don’t want to discover the truth, at least in the microcosm of reasoning tested in the experiment. All people needed was a strategy which helped them overcome the natural human short-sightedness to alternatives.
The moral for making better decisions is clear: wanting to be fair and objective alone isn’t enough. What’s needed are practical methods for correcting our limited reasoning — and a major limitation is our imagination for how else things might be. If we’re lucky, someone else will point out these alternatives, but if we’re on our own we can still take advantage of crutches for the mind like the “consider the opposite” strategy.
In other words, as a species we can be open to receiving information outside our realm of understanding, but it will take leadership we can believe in to take the reins and design technology that assists in removing bias and de-filtering.