Social networks and news outlets use recommender systems to distribute information and suggest news to their users. These algorithms are an attractive solution for dealing with the massive amount of content on the web [6]. However, some organisations prioritise retention and maximising the number of accesses, which can be incompatible with values like content diversity and transparency. In recent years, critics have warned of the dangers of algorithmic curation. The term filter bubble, coined by the internet activist Eli Pariser [1], describes the outcome of pre-selected personalisation, where users are trapped in a bubble of similar content. Pariser warns that it is not the user but the algorithm that curates and selects interesting topics to watch or read. Still, there is disagreement about the consequences for individuals and society, and research on the existence of filter bubbles is inconclusive. Fletcher [5] claims that the term filter bubble is an oversimplification of a much more complex system involving cognitive processes and social and technological interactions, and most empirical studies indicate that algorithmic recommendations have not locked large segments of the audience into bubbles [3][6].

We built an agent-based simulation tool to study the dynamic and complex interplay between individual choices and social and technological interactions. The model includes different recommendation algorithms and a range of cognitive filters that can simulate different social network dynamics. The cognitive filters are based on the triple-filter bubble model [2]. The tool can be used to understand under which circumstances algorithmic filtering and social network dynamics affect users' innate opinions, and which interventions on recommender systems can mitigate adverse side effects like the presence of filter bubbles. The resulting tool is an open-source interactive web interface, allowing simulations with different parameters such as users' characteristics, social networks and recommender system settings (see Fig. 1). The ABM, implemented in Python with Mesa [4], allows users to visualise, compare and analyse the consequences of combining various factors.

Experiment results are similar to the ones published in the triple-filter bubble paper [2]. The novelty is the option to use a real collaborative-filtering recommender system and a new metric to measure the distance between users' innate and final opinions (one possible reading of this metric is sketched after the reference list). We observed that slight modifications to the recommendation system, exposing items within the boundaries of users' latitude of acceptance, could increase content diversity.

References
1. Pariser, E.: The Filter Bubble: What the Internet Is Hiding from You. Penguin, New York, NY (2011)
2. Geschke, D., Lorenz, J., Holtz, P.: The triple-filter bubble: Using agent-based modelling to test a meta-theoretical framework for the emergence of filter bubbles and echo chambers. British Journal of Social Psychology 58, 129–149 (2019)
3. Möller, J., Trilling, D., Helberger, N., van Es, B.: Do not blame it on the algorithm: An empirical assessment of multiple recommender systems and their impact on content diversity. Information, Communication and Society 21(7), 959–977 (2018)
4. Mesa: Agent-based modeling in Python, https://mesa.readthedocs.io/. Last accessed 2 Sep 2022
5. Fletcher, R.: The truth behind filter bubbles: Bursting some myths. Digital News Report, Reuters Institute (2020). https://reutersinstitute.politics.ox.ac.uk/news/truth-behind-filter-bubbles-bursting-some-myths. Last accessed 2 Sep 2022
6. Haim, M., Graefe, A., Brosius, H.: Burst of the filter bubble? Effects of personalization on the diversity of Google News. Digital Journalism 6(3), 330–343 (2018)
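The abstract does not define the opinion-distance metric formally, so the sketch below is only one plausible reading, not the authors' definition: the mean absolute difference between each user's innate and final opinion. The function name and the assumed [-1, 1] opinion scale are ours.

```python
# Hypothetical sketch of an innate-vs-final opinion distance (our reading,
# not the paper's definition). Opinions are assumed to lie on a [-1, 1]
# scale; 0 means opinions never moved, larger values mean stronger drift.
def opinion_distance(innate, final):
    """Average absolute shift between users' innate and final opinions."""
    if len(innate) != len(final):
        raise ValueError("opinion vectors must be the same length")
    return sum(abs(a - b) for a, b in zip(innate, final)) / len(innate)

# Example: three users who each drifted by 0.2 from their innate opinion.
print(opinion_distance([0.0, -0.5, 0.8], [0.2, -0.3, 0.6]))  # ~0.2
```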
Algorithmic curation is a helpful solution for the massive amount of content on the web. The term describes how platforms automate the recommendation of content to users. News outlets, social networks and search engines widely use recommendation systems. Such automation has led to worries about selective exposure and its side effects. Echo chambers occur when we are over-exposed to news we like or agree with, distorting our perception of reality [1]. Filter bubbles arise when the information we dislike or disagree with is automatically filtered out, narrowing what we know [2]. While the idea of filter bubbles makes logical sense, the magnitude of the "filter bubble effect" in reducing diversity has been questioned [3], and most empirical studies indicate that algorithmic recommendations have not locked large audience segments into bubbles [4]. However, little attention has been paid to the interplay between technological, social and cognitive filters.

We propose an agent-based model (ABM) to simulate users' emergent behaviour and track their opinions when they get news from news outlets and social networks. The model aims to understand under which circumstances algorithmic filtering and social network dynamics affect users' innate opinions, and which interventions can mitigate the effect. Agent-based models simulate the behaviour of many individual agents forming a larger society; the behaviour of each agent can be elementary, yet the population's behaviour can be much more than the sum of its parts (a minimal sketch follows the references below). We have designed different scenarios to analyse the factors contributing to the emergence of filter bubbles, including different recommendation algorithms and social network dynamics. The cognitive filters are based on the triple-filter bubble model [5].

References
1. Richard Fletcher, 2020
2. Eli Pariser, 2012
3. Chitra & Musco, 2020
4. Möller et al., 2018
5. Daniel Geschke et al., 2018
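To make the ABM idea concrete, here is a minimal sketch in Python using Mesa, the framework cited in the first abstract. All names, the [-1, 1] opinion scale, the acceptance threshold and the adjustment rate are illustrative assumptions, not the paper's model; it uses Mesa's classic Agent/Model/RandomActivation API as documented around 2022 (newer Mesa versions have changed this interface).

```python
# Minimal illustrative ABM (our sketch, not the authors' code): agents hold
# an opinion in [-1, 1] and move toward recommended items that fall inside
# an assumed latitude of acceptance.
import random

from mesa import Agent, Model
from mesa.time import RandomActivation

LATITUDE = 0.3  # assumed latitude of acceptance
RATE = 0.1      # assumed opinion adjustment rate

class NewsReader(Agent):
    """A reader with a fixed innate opinion and a drifting current opinion."""

    def __init__(self, unique_id, model):
        super().__init__(unique_id, model)
        self.innate = random.uniform(-1, 1)  # fixed starting opinion
        self.opinion = self.innate           # current opinion

    def step(self):
        item = self.model.recommend(self)
        # Items inside the latitude of acceptance pull the opinion toward them.
        if abs(item - self.opinion) <= LATITUDE:
            self.opinion += RATE * (item - self.opinion)

class NewsModel(Model):
    """A population of readers fed by a placeholder recommender."""

    def __init__(self, n_agents=100):
        super().__init__()
        self.schedule = RandomActivation(self)
        for i in range(n_agents):
            self.schedule.add(NewsReader(i, self))

    def recommend(self, agent):
        # Placeholder: random item positions; a real tool would plug in
        # collaborative filtering or another recommender here.
        return random.uniform(-1, 1)

    def step(self):
        self.schedule.step()

model = NewsModel()
for _ in range(50):
    model.step()
drift = sum(abs(a.opinion - a.innate) for a in model.schedule.agents)
print(drift / len(model.schedule.agents))  # average innate-to-final distance
```

Even this toy version shows the emergent quality the abstract describes: each agent follows one trivial rule, yet the population-level opinion distribution depends on how the recommender selects items.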
Social networks and news outlets entrust content curation to specialised algorithms from the broad family of recommender systems. Companies attempt to increase engagement by connecting users with ideas they are more likely to agree with. Eli Pariser, who coined the term filter bubble, suggested that this might come at the price of narrowing users' outlook. Although empirical studies on algorithmic recommendation have shown no reduction in diversity, these algorithms remain a source of concern due to increased societal polarisation of opinions. Diversity has been widely discussed in the literature, but little attention has been paid to the dynamics of user opinions under the influence of algorithmic curation and social network interaction.

This paper describes our empirical research using an agent-based modelling (ABM) approach to simulate users' emergent behaviour and track their opinions when they get news from news outlets and social networks. We address under which circumstances algorithmic filtering and social network dynamics affect users' innate opinions, and which interventions can mitigate the effect.

The simulation confirmed that an environment curated by a recommender system did not reduce diversity. The same outcome was observed in a simple social network with items shared among users. However, opinions were less susceptible to change: the difference between users' current and innate opinions was lower than in an environment where users selected items at random. Finally, we propose a modification to the collaborative filtering algorithm that selects items at the boundary of users' latitude of acceptance, increasing the chances of challenging users' opinions.
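The abstract only names this intervention, so the sketch below shows one way such a re-ranking step could look. The function, its parameters, and the choice to prefer items near the edge of the latitude are our illustration, layered on top of whatever predicted scores the underlying collaborative-filtering model produces.

```python
# Illustrative re-ranking step (our sketch, not the paper's implementation):
# keep only candidates whose opinion position falls inside the user's latitude
# of acceptance, then prefer those closest to its boundary, so recommendations
# stay acceptable but still pull the user away from their current opinion.
def rerank(user_opinion, candidates, cf_scores, latitude=0.3, top_k=5):
    """candidates: list of (item_id, position); cf_scores: item_id -> score."""
    acceptable = [
        (item_id, pos) for item_id, pos in candidates
        if abs(pos - user_opinion) <= latitude
    ]
    # Sort by distance from the user's opinion (descending), breaking ties
    # with the collaborative-filtering score, and return the top item ids.
    acceptable.sort(key=lambda c: (-abs(c[1] - user_opinion), -cf_scores[c[0]]))
    return [item_id for item_id, _ in acceptable[:top_k]]

# Example: a user at 0.1 with latitude 0.3 sees the items at 0.35 and -0.15
# before the item at 0.12 that merely confirms their current opinion; the
# item at 0.9 is outside the latitude and would likely be rejected anyway.
items = [("a", 0.12), ("b", 0.35), ("c", -0.15), ("d", 0.9)]
scores = {"a": 0.8, "b": 0.7, "c": 0.6, "d": 0.9}
print(rerank(0.1, items, scores))  # ['b', 'c', 'a']
```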