Daily routine: we wake up in the morning, we have breakfast, and we open our smartphones to catch up on news about the world around us. It seems as innocent as it sounds; the problem is that most of us are probably unaware that the information we get from social media or search engines has been filtered and delivered to us in a perfectly tailored form, matched to each individual’s preferences.
Ultimately, we have unconsciously slipped into the post-truth era, in which objective facts are no longer equally accessible to everyone. The truth we acquire from the internet is the truth we want to believe in, the truth that comforts our worldview and meets our expectations, not the truth that might challenge our beliefs. This mechanism of keeping us away from certain information and tying us only to familiar content is what has become known, under a newly emerged term, as the filter bubble.
The expression filter bubble was coined by the internet activist Eli Pariser, who used it to describe the way news is distributed among users. According to Pariser, search results are personalized by an algorithm that collects data, interprets it, and then tries to match content to an individual’s preferences based on previous search history, location, clicking behavior, age, gender and so on. This leads to the creation of information bubbles that reinforce people’s existing beliefs and expose them to fewer contrasting viewpoints. Given how opaque the newsfeed on social media is, this has many implications for society; so how can it negatively affect public opinion?
Tech giants such as Google and Facebook all rely on invisible tracking mechanisms, such as cookies and tracking scripts, to deliver the content users are most likely to be interested in. The upside of this method is that it removes irrelevant content from our newsfeed; the downside is that it can strongly influence our political decisions. This is psychologically harmful, because we are constantly kept inside ‘ideological frames’, within a narrow community of shared interests, which in turn leads to confirmation bias (the echo chamber).
As an example, Eli Pariser ran an experiment with his friends, asking them to search Google for BP (British Petroleum). One friend received advertisements about the company’s investments, while the other saw information about the oil spill in the Gulf of Mexico. He observed the same effect when searching for a particular political party: he was shown more content supporting his own party than the opposing one. Because users are rarely presented with alternative perspectives, this can fuel political polarization or, in the worst case, political extremism. The phenomenon shows how vulnerable users are to such algorithmic manipulation and auto-propaganda, and how easily powerful companies can shape our political views and influence our decisions. Looked at more broadly, it may also be a major threat to democracy.
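To make the mechanism concrete, here is a minimal, purely illustrative sketch of preference-based re-ranking. It is not any real search engine’s algorithm; the profile fields, weights and example data are hypothetical, chosen only to mirror the BP anecdote above.

```python
# Illustrative sketch only: hypothetical profile fields and weights,
# not the algorithm of any real search engine.

def personalize(results, profile):
    """Re-order generic search results using signals from a user profile."""
    def score(result):
        s = result["base_relevance"]
        # Boost topics the user has clicked on or searched for before.
        if result["topic"] in profile["click_history"]:
            s += 2.0
        if result["topic"] in profile["search_history"]:
            s += 1.0
        return s
    return sorted(results, key=score, reverse=True)

# Two users issuing the same query see different orderings.
results = [
    {"title": "BP investment news", "topic": "finance", "base_relevance": 1.0},
    {"title": "Gulf of Mexico oil spill", "topic": "environment", "base_relevance": 1.0},
]
investor = {"click_history": {"finance"}, "search_history": set()}
activist = {"click_history": {"environment"}, "search_history": set()}
print([r["title"] for r in personalize(results, investor)])
print([r["title"] for r in personalize(results, activist)])
```

Even with such a trivial scoring rule, two people end up with different front pages for the identical query, which is the essence of the bubble.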
“Democracy requires citizens to see things from one another’s point of view, but instead we’re more and more enclosed in our own bubbles. Democracy requires a reliance on shared facts; instead, we’re being offered parallel but separate universes… Personalization filters serve a kind of invisible autopropaganda, indoctrinating us with our own ideas, amplifying our desire for things that are familiar and leaving us oblivious to the dangers lurking in the dark territory of the unknown.” – Eli Pariser
The first thing we see after logging in to Facebook is the newsfeed, a stream aggregating the recent activities of our friends. Each action they take generates an event called an edge: sharing, commenting on or liking someone else’s status, posting or tagging a photo, and so on. Multiply these activities by several hundred friends, add every followed fan page, and the mass of information in our newsfeed becomes overwhelming. For this reason Facebook introduced an algorithm that filters out irrelevant content and prioritizes edges, a system later named EdgeRank. As a result, the news stream is composed differently for every user.
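The core idea behind EdgeRank, as it has been publicly described, is that each edge is scored as the product of affinity (how close we are to its author), the weight of the interaction type, and a time decay, and the feed is ordered by these scores. The sketch below is only a rough illustration of that idea; the weights, the decay formula and the data structures are hypothetical, not Facebook’s actual implementation.

```python
# Rough, hypothetical illustration of EdgeRank-style scoring; the weights and
# the decay function are made up, not Facebook's real parameters.
import time
from dataclasses import dataclass

# Hypothetical weights per edge type: richer interactions count for more.
EDGE_WEIGHTS = {"comment": 4.0, "share": 3.0, "tag": 2.0, "like": 1.0}

@dataclass
class Edge:
    author: str        # who produced the activity
    edge_type: str     # "like", "comment", "share", "tag"
    created_at: float  # Unix timestamp of the activity

def edge_score(edge: Edge, affinity: dict, now: float) -> float:
    """Score one edge as affinity * weight * time decay."""
    u = affinity.get(edge.author, 0.1)         # how often we interact with this friend
    w = EDGE_WEIGHTS.get(edge.edge_type, 1.0)  # value of the interaction type
    age_hours = (now - edge.created_at) / 3600.0
    decay = 1.0 / (1.0 + age_hours)            # newer edges rank higher
    return u * w * decay

def rank_newsfeed(edges: list, affinity: dict) -> list:
    """Order edges the way a personalized feed would display them."""
    now = time.time()
    return sorted(edges, key=lambda e: edge_score(e, affinity, now), reverse=True)
```

Because the affinity term depends on whom we already interact with most, such a feed keeps amplifying those same sources, which is exactly how the bubble tightens over time.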
EdgeRank also serves a commercial purpose: the personal data it feeds on is what allows Facebook and the corporations advertising on it to target us with particular advertisements. Most of us have probably had the uncanny impression that Facebook ads know our shopping list suspiciously well; products we recently bought keep popping up on every website we visit. Targeted ads can help us find what we need, but the price is our private data.
“Personalization is based on a bargain. In exchange for the service of filtering, you hand large companies an enormous amount of data about your daily life — much of which you might not trust your friends with.” – Eli Pariser
Privacy policies have changed a great deal since the early days, when people still worried about strangers finding out who they were. The days of being anonymous on the internet are gone, sacrificed for advertising revenue. By agreeing to the privacy policies of various websites, we have become ‘the company’s product’.
The faster we surf across the surface of the Web—the more links we click and pages we view—the more opportunities Google gains to collect information about us and to feed us advertisements. Its advertising system, moreover, is explicitly designed to figure out which messages are most likely to grab our attention and then to place those messages in our field of view. Every click we make on the Web marks a break in our concentration, a bottom-up disruption of our attention—and it’s in Google’s economic interest to make sure we click as often as possible.
In the attention economy, the filter bubble plays a huge role in extending the time we spend online and thus maximizing corporate revenue, constantly feeding us products ready for purchase. Social media has become a vast marketing platform and gatekeeper that aggressively encourages people to buy more than they actually need. For users who are never shown the full range of what could appear in their newsfeed, this amounts to a serious form of psychological manipulation.
What can we do to minimize the negative impact of content-filtering algorithms and thus reduce the harm done by the filter bubble?
Change the newsfeed sorting option from “most interesting” to “latest”. The feed then becomes a reverse-chronological stream of news, instead of prioritizing the posts that are most liked, commented on and shared. This gives a greater sense of order and a simpler criterion for sorting content.
Select the “receive notifications” option for the pages or people we value most. That way, we can be sure that content from a given person or organization will always appear in our newsfeed, regardless of the filtering mechanism.
Sort and classify pages and friends according to how important they are to us, by assigning them dedicated labels and creating personalized tabs for the pages we follow. This acts as an additional, user-controlled layer of content filtering.
To conclude, it is extremely important to stay conscious while browsing the internet, and from society’s point of view the key to reducing the negative impact of the filter bubble is transparency. According to Holly Green, a Forbes journalist, there needs to be greater public awareness of how the algorithms of sites like Google and Facebook work. At the moment only a handful of people are aware that such mechanisms exist, let alone understand how they function or what their implications are. By teaching this topic in schools and in the media, perhaps we, the users, will start to notice the relationship between the simple act of “liking” something and how much information reaches us, and how much does not. And perhaps we will begin to perform even these simple actions more reflectively, without forgetting the “bubble” in which we can so easily be enclosed, and which is so hard to see from day to day.
By Małgorzata Borkowska