Cyberbalkanization

In this article, I will try to go through a topic that has been gathering a lot of attention in recent years, and that personally fascinates me: the study of Internet echo chambers and how information spreads through them. A study by the Anti-Defamation League (ADL) focused on the instant messaging app Telegram as a source of, and safe haven for, echo chambers.

Telegram, a cloud-based social networking and messaging app, counts more than 200 million active users and hosts around 221 thousand groups and channels. Some are private, some are public, and some require a digital “key” to access, rendering them effectively invisible. The content of these channels varies enormously, ranging from channels dedicated to job seeking to those focused on gardening, and anything in between. The privacy that Telegram provides its users is almost unique compared to other messaging apps such as Messenger or WhatsApp.

Many have argued that the lack of control and the total freedom that users enjoy on these platforms have been creating monsters. Correlation does not mean causation, yet in such an environment violent extremists of all kinds, white supremacist, alt-right and other groups, have found a hospitable place to share and compare their ideas with like-minded individuals. Many of the white supremacist channels count thousands of subscribers, and they are not subtle about their ideals. They are very easy to find, and it is even more surprisingly easy to find content that, while violating Telegram’s terms of use, is still widely accessible and shareable. Videos depicting mass murders and mass shootings, as well as violent extremist propaganda, are easy to come across, even after having been removed from other, similar platforms.

These findings raise some questions: how common is the open “publicity” of this toxic and dangerous rhetoric? ADL’s studies of online behavior suggest that behind every recorded act of violent extremism there are one or more individuals who have been through radicalization journeys that, although relying on different ideologies, followed a very similar pattern. Anonymous internet communities and forums, such as the popular imageboard website 4chan and its many offshoots – to name a couple, 8chan and 8kun – have become a stage on which the “actors” perform their play.

It is not a surprise that 8chan was taken down from the clearnet in August 2019, after three different mass shooters, including the perpetrators of the mosque killings in Christchurch, New Zealand, and the synagogue shooting in Poway, California, had used the website to share their manifestos and announce the murders to “the community”. These perpetrators deeply identified with the ideologies shared among the users of these communities.

Research by the Network Contagion Research Institute (NCRI) has underlined that, although the study of hate and violent extremism is nothing new, the scale and reach of the phenomenon in the Internet era have deeply undermined the effectiveness of traditional approaches to monitoring and measuring these trends. The report has helped shed light on the ever-growing phenomenon of online echo chambers. NCRI came to these conclusions after conducting the largest quantitative study to date on the rise of the alt-right and white supremacy on online platforms.

The perpetrators of the Christchurch and the 2019 Halle synagogue shootings were active users of online forums such as 8chan and Gab. Their actions have demonstrated how tight the relationship between propaganda and actual, real acts of terror is. Online genocidal fantasies ignited real-life violent extremism, whose actors were then showcased as martyrs and patriots in the very online communities where they themselves had been brainwashed. It is a vitriolic and vicious cycle that shows how deep the self-identification with these communities actually runs. These extremists hide behind what they openly call “freedom of speech”, and there is evidence that online extremist media has spread at a scale and rate that would have been almost impossible to predict at the “dawn” of social media.

In 1996, MIT researchers Marshall Van Alstyne and Erik Brynjolfsson warned about a potential dark side to our newly interconnected world: “Individuals empowered to screen out material according to their preferences may form virtual cliques, insulate themselves from opposing points of view, thus reinforcing their biases”. People tend to engage most with information that flatters their ideological preconceptions; misinformation flourishes online because users tend to aggregate in communities based on interest, which leads to the reinforcement of confirmation bias, segregation, and polarisation.
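To make that mechanism concrete, here is a minimal, hypothetical sketch of how preference screening can split a population into “virtual cliques”. It is my own toy illustration (a bounded-confidence opinion model), not something taken from Van Alstyne and Brynjolfsson’s paper: agents only engage with peers whose views fall inside a tolerance window, and with a narrow window they settle into separate opinion clusters instead of converging.

```python
import random

# Toy "virtual cliques" sketch (an illustration of the idea above, not part of
# the cited research): agents hold an opinion in [0, 1] and only engage with
# peers whose views fall inside a tolerance window, a crude stand-in for
# screening out material according to one's preferences.

def simulate(n_agents=100, tolerance=0.2, steps=20000, seed=0):
    rng = random.Random(seed)
    opinions = [rng.random() for _ in range(n_agents)]
    for _ in range(steps):
        i = rng.randrange(n_agents)
        j = rng.randrange(n_agents)
        if i != j and abs(opinions[i] - opinions[j]) < tolerance:
            # Like-minded pairs pull each other closer; everyone else is ignored.
            midpoint = (opinions[i] + opinions[j]) / 2
            opinions[i] = (opinions[i] + midpoint) / 2
            opinions[j] = (opinions[j] + midpoint) / 2
    return sorted(opinions)

if __name__ == "__main__":
    # With a narrow tolerance the population settles into a few isolated
    # opinion clusters instead of meeting somewhere in the middle.
    print([round(x, 2) for x in simulate()[::10]])
```

Widening the tolerance window (say to 0.5) tends to let the whole population drift toward a shared middle ground, which is roughly the contrast the researchers were worried about losing once people can filter out everything they dislike.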

Nobody is safe from propaganda; not me, not you, not my family members, nor your favorite singer – we all seek out online information that we feel will align with our values. It is much easier and much more pleasant to spend our time hanging out and talking with like-minded friends and acquaintances than to turn every evening at the bar into an ideological debate. Naturally, I am also guilty of this – I admit to having cut people out of my social sphere after our political or ideological values clashed.

But is this the right approach to conflict? Or is it just an easy way to turn our heads away from ideas that may challenge what we see as an “established” reality? It does not have to be this way, does it? Echo chambers may feel familiar and comforting, but in the long term they end up locking us into a sort of tribal divide. We need to work on ourselves first. Self-scrutiny is necessary: we need to learn how to pick and select our sources and communities. We cannot cling to and rely solely on whatever makes us feel “comfortable” and chimes with our beliefs. Sound, unbiased evidence should be enough to, if not change, at least influence our views.

Laura Ghiretti
