Social media have radically changed how we access information and form our opinions. How do we seek or avoid information, and how do those decisions affect our behavior, especially when the news cycle is dominated by the disintermediated diffusion of information that alters the way news is consumed and reported?
A multitude of factors affects how information spreads on social media platforms. Online polarization, for instance, may foster the spread of misinformation. Our attention span remains limited, and feed algorithms might narrow our selection process by suggesting content similar to what we are usually exposed to.
Furthermore, people show a tendency to favor information that adheres to their beliefs and to join groups formed around a shared narrative, that is, echo chambers. We can broadly define echo chambers as environments in which the opinion, political leaning, or belief of users about a topic is reinforced through repeated interactions with peers or sources having similar tendencies and attitudes. Selective exposure and confirmation bias (i.e., the tendency to seek information adhering to preexisting opinions) may explain the emergence of echo chambers on social media.
According to group polarization theory, an echo chamber can act as a mechanism that reinforces an existing opinion within a group and, as a result, moves the entire group toward more extreme positions. Echo chambers have been shown to exist in various forms of online media such as blogs, forums, and social media platforms and apps. Some studies describe echo chambers as an emergent effect of human tendencies such as selective exposure, contagion, and group polarization.
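The dynamic described above can be illustrated with a toy simulation. The sketch below is a hypothetical bounded-confidence model (all parameter names and values are illustrative assumptions, not taken from any study cited here): agents interact only with peers whose opinions are already close to their own, and each interaction both pulls the pair together and nudges them slightly toward the nearer extreme, a crude stand-in for reinforcement within an echo chamber.

```python
import random

def simulate(n=100, steps=20000, eps=0.3, mu=0.2, push=0.01, seed=1):
    """Toy bounded-confidence model with a reinforcement term.

    Agents hold opinions in [-1, 1]. Two agents interact only if their
    opinions differ by less than `eps` (homophily / selective exposure).
    When they do, each moves a fraction `mu` toward their midpoint and
    drifts by `push` toward the nearer extreme (group polarization).
    """
    random.seed(seed)
    ops = [random.uniform(-1, 1) for _ in range(n)]
    for _ in range(steps):
        i, j = random.sample(range(n), 2)
        if abs(ops[i] - ops[j]) < eps:
            mid = (ops[i] + ops[j]) / 2
            for k in (i, j):
                o = ops[k] + mu * (mid - ops[k])      # pull toward partner
                o += push if o > 0 else -push          # drift toward extreme
                ops[k] = max(-1.0, min(1.0, o))        # clamp to [-1, 1]
    return ops

ops = simulate()
# Opinions end up clustered near the extremes rather than spread uniformly.
print(sum(1 for o in ops if abs(o) > 0.7) / len(ops))
```

Even though agents start with uniformly spread opinions, the combination of homophilic interaction and a small reinforcement drift sorts them into clusters that migrate toward the extremes, which is the qualitative claim of group polarization theory, not a quantitative prediction.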
However, the effects and the very existence of echo chambers have recently been questioned. Scientists at the Reuters Institute have published an overview stating that studies in the UK and several other countries, including the highly polarised US, have found that most people have relatively diverse media diets; that those who rely on only one source typically converge on widely used outlets with politically diverse audiences (such as commercial or public service broadcasters); and that only small minorities, often just a few percent, get news exclusively from partisan sources.
The forms of algorithmic selection offered by search engines, social media, and other digital platforms generally lead to slightly more diverse news use – the opposite of what the "filter bubble" hypothesis posits. Self-selection, however, primarily among a small minority of highly partisan individuals, can lead people to opt into echo chambers, even as the vast majority do not.
There is limited research outside the United States that systematically examines the possible role of news and media use in contributing to various kinds of polarisation, and the work that has been done does not always find the same patterns as those identified in the US. In the specific context of the United States, where there is more research, it seems that exposure to like-minded political content can potentially polarise people or strengthen the attitudes of those with existing partisan leanings, and that cross-cutting exposure can potentially do the same for political partisans.
Fortunately, people who access information via Google search or Facebook see a wider selection of sources than proponents of the filter bubble would predict. The reasons why someone retreats into an echo chamber are instead to be found in the personal dispositions of the individual users, in line with the three tendencies we explained above: selective exposure, confirmation bias, and group polarization.
In summary, this aggregation of users into homophilic clusters dominates online interactions in the social media sphere. Despite the small percentage of users found in echo chambers, their negative effects of reinforcing near-exclusive exposure to conspiracy theories have been extremely harmful to our societies, especially during the recent pandemic, from vaccine hesitancy to the endless protests against Covid measures.
Read the full Reuters Institute review and findings here.
We do our best to provide you with the latest fact-checking techniques and tools. Donate every time you read disinformation, and the money will be used to pay for a fact-checking ad!