Online Communities of Hate: The New Global Terrorism
In a few short years, a small group of toxic individuals has run misinformation campaigns so effective that they have sown division and discord around the world. Through manipulation tactics, these individuals have reached everyone from top influencers to mainstream media events. They no longer remain hidden in the dark, but feel empowered to enter the mainstream to spread conspiracy theories and propaganda. What began as a slow-moving ooze of lies has escalated into a tsunami that overtakes conversations and worldviews. Resisting this movement requires exposure and action.
Our global society has an amazing tool and resource in the internet, but along the way, a few small groups of people mutated this marvel of technology and weaponized it through hate, racism, and bigotry. We see the results of misinformation and manipulation every day as individuals act on the constant flow of propaganda with hate-filled violence. It's difficult for many of us to understand what goes on in the minds of those devoted and committed to an existence of hating others because of their color, religion, or heritage. Racism and hate groups have always been around, but it is the internet that has allowed people with ulterior motives to create and share their messages worldwide. Like most socially unacceptable movements, hate communities were once mainly underground, but in recent times radicalization has taken a sharp turn, allowing these communities to catapult the individual "lone wolf" into terroristic action.
Many hate groups share common goals, and they follow the cult playbook. These once involved a requirement to meet in person. Their philosophies center on the idea that people of color, other cultures, religions, or even political beliefs have destroyed their country, and that they cannot "take their country back" while sitting behind a keyboard. As their plans and discussions became more intense, various sites rejected the members, forcing them to seek out other shelters in which to share their hate.
Over time, hate groups honed and refined their messages, deliberately reshaping their landscape to attract and appeal to those who shared their views. "White supremacists/nationalists" were recast as "victims" of attacks who needed to protect the "white state." This twisted philosophy has been the central theme that evolved into what we see today across a variety of hate communities.
Social Media: Profitable Hate
Once upon a time, Reddit offered the "geeks" of the world a completely open platform to talk and exchange information and ideas, no matter how esoteric. Over time, Reddit became the perfect forum for individuals to show up, spew hateful rhetoric, and both encourage and prompt each other to be their very worst. The U.S. was not alone in the online rise of white supremacist and nationalist ideologies; the same patterns were appearing in some European countries. While there is a litany of reasons that people join these hate groups, the one ribbon of commonality seems to be that they feel persecuted for their beliefs, and a new platform that allows and even encourages them to speak out feels like vindication.
When it comes to hate, there doesn't seem to be any bar low enough, and as the various sites that harbored these people grew, so did the horrors. Many of the social media sites were naively operating on the ideology of an open platform for connection. These platforms became wildly successful, and yet the more nefarious conversations were ignored. It didn't take long for the hate groups to latch onto social media as an easy way to find one another and transmit their messages. The open and free structure of social media sites then allowed the hate to go viral.
The exchanges within social media groups, as well as on Reddit, established ideologies that others could respond to and encourage. As time progressed, each message of hate became more daring. Around the same time that social media hate groups were growing, other sources escalated and amplified the allure. It wasn't until a white supremacist and nationalist rally in Charlottesville, Virginia ended in the murder of a young woman, Heather Heyer, that Reddit began to clean up its platform and to identify and remove hate communities.
The Rise of 4chan and 8chan
4chan began as an online destination where gamers and hackers banned from other sites could congregate and share the more socially unacceptable aspects of their belief systems while hiding their identities. Members could comment on and encourage each other without fear of accountability. When 4chan wasn't dark enough, 8chan appeared; it remains one of the deepest and darkest areas outside the dark web, where one can find the worst that humanity has to offer. From child pornography to ultraviolence, white supremacists and nationalists found a home among like-minded individuals who could influence and brainwash.
8chan is one of the sites that both the New Zealand and Poway, California shooters used to justify and amplify their extremist ideologies. The individuals who start hate chatter act as a catalyst for those who feel misunderstood or alone. The volume of anti-Semitic, anti-Muslim, and general hate rhetoric is so high that the site draws more inquiries from law enforcement and government than any other public website. 8chan embodies the perfect storm for those who harbor hate, but more importantly, it is a breeding ground for those who want to use brainwashing to entice others. They use terms that the like-minded can relate to, slowly escalating until members are being manipulated under a banner of anonymity.
The people who are part of 8chan and hate groups usually have other reinforcing influences as well. These can include watching ever more toxic YouTube videos, listening to far-right conspiracy talk radio such as Alex Jones's show, and, for younger audiences, watching the YouTuber PewDiePie. Some content is outright propaganda; sources such as PewDiePie intertwine humor, gaming, and silly voices with racist and anti-Semitic phrases, including "death to all Jews." Posing as entertainment, PewDiePie normalizes racism and anti-Semitism in a passive-aggressive manner.
Stochastic Terrorism and the Lone Wolf
When we think of terrorists, we visualize an orderly organization with a leader whose orders are carried out by underlings. Stochastic terrorism is a more abstract concept: the use of internet influence to incite violence that is statistically predictable in aggregate but unpredictable in any individual case. What begins with edgy, off-color jokes and stereotypes leads to further layers designed to attract viewers with what might seem like benign behavior. As members add to the conversation, the overtones become harsher.
Stochastic terrorism taps into subtle and pervasive biases that are innate in many people. The casual use of racist and bigoted terminology makes such feelings comfortable enough to express, first as online harassment, then by sharing their darkest feelings with those around them. The range of ideas a reader or viewer finds acceptable for discussion is called the "Overton window." Keeping someone inside the parameters of that window of acceptability allows those in control to slowly shift it, so that ideas once outside the realm become part of the accepted dialogue — "moving the window."
As an individual adopts each level of stochastic terrorism, it becomes easier to move them to the next, more aggressive level. While the majority will simply join in, the most vulnerable — those suffering from mental instability — will take the words, terms, and affirmations as personal instructions. This is where the "lone wolf" comes into play, and they take their role very seriously. They view themselves as heroes whose sole goal is to rid the world of those they believe are destroying the white race and to eliminate any and all who are causing the problems of the world.
Those who participate are often lone individuals who feel disenfranchised and who surround themselves with online spaces that normalize edgy and abusive messages as irony or humor. Don't underestimate this process. These spaces use terms and expressions that only their own "insiders" recognize, yet which are too vague to be blatantly racist or bigoted.
The whole point of stochastic terrorism is to intentionally incite acts of violence while preserving deniability for those who incite them. Members are desensitized by racist chants and memes as they accept and absorb the negativity. Each step takes those who buy into this process deeper into the dark hole. The messages and actions are further reaffirmed when those in power espouse rhetoric or take actions that strip control from the cultures, people of color, or religions that members believe infringe on their personal rights.
Both the New Zealand and California shooters were portrayed as "lone wolves," but in many ways they were part of a global terrorist organization connected by toxic ideology through an internet connection.
Each drew on their participation in a variety of online hate communities and groups to create their "manifestos," which were themselves products of stochastic terrorism. The brainwashing and propaganda they absorbed accumulated over time, so their acts of shooting innocent people could not be blamed on any specific leader.
The new alt-right and other purveyors of hate, racism, and bigotry have redesigned their look and appeal to pass as "alt-lite." Each progression moves the viewer or reader further up the ladder of racism and bigotry until they accept all of the ideologies. In essence, this ploy is a rebrand of hate that accomplishes the same end goal. It will take continual effort to pull the strands of misinformation apart and expose the underlying malicious narratives and intentions of these actors.
We have entered an age in which the technologies we all use must offer additional layers of identification to help ensure that we win the battle against misinformation and its damaging and violent effects on society. People have an innate desire for accuracy and are becoming outraged at the idea that others are manipulating them for profit and personal agendas.
At Blackbird.AI, we believe that everyone needs a way to immediately identify what they are viewing and where it originally came from. We have a sophisticated AI- and human-driven platform that offers "Credibility Labels": validated assessments of the topics, images, sites, and sources that harbor suspicious misinformation, including web links, memes, and other digital content. Our mission is to make sure that individuals can make personal choices about what they want to see and know, and we aim to arm the public with the tools to help in those decisions. We work to "pre-bunk" and expose those who have ulterior motives, and we aim to provide the same tools to governments, social media platforms, and thinking citizens of the world.
We do not accept censorship as the solution; that is an alternate slippery slope that will slowly eat away at the freedom of choice that is the basic foundation of democracy. As our "Credibility Labels" are applied, purveyors of fake news, AI deepfakes, false and misleading memes, and websites designed to harm will lose their ability to drive the narratives they so eagerly desire. Each individual needs the ability to be in control and stop the manipulation.
This Is Why We Fight.
We are fighting the war against misinformation to create a more empowered, critically thinking society.
To find out more about our team and the Blackbird.AI Mission, visit us at www.blackbird.ai