“Sidewalk express – Internet point” by nicolasnova is licensed under CC BY 2.0
Since its creation, social media has had a profound effect on the views we hold. Social media is an environment where people of all nations, races, ideologies, and religions can meet and talk to one another, each sharing their own perspective; at least, this is how it should be. However, in recent years social media has had more of a negative effect than a positive one on our relationships with those different from us. This is due in part to the algorithms used by social media websites like Facebook and Twitter. In this paper we will discuss how our political views shape social media algorithms, and how those algorithms in turn shape our political views. This cycle is a vicious one, and it has dangerous repercussions for our democracy.
Social media websites use algorithms to determine what is shown first to users, typically by placing whatever the algorithm predicts the user is most likely to interact with at the top of their feed. Algorithms also base their selections on how other users have interacted with an article: news articles with a large number of likes or shares/retweets are more likely to appear than those with fewer. While the exact algorithms that Twitter and Facebook use are kept secret, both make the most popular content the easiest to access. The more popular articles are then catered to the user based on who the user is following, what the user’s followers are interacting with, and what the user themselves is interacting with. This process is known as popular sorting, and it is the core function of social media feed algorithms. Popular sorting is not perfect, and it can easily be abused to promote content that may or may not be factually correct, something we saw during the 2016 election.
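To make the idea of popular sorting concrete, here is a toy sketch in Python. The engagement weights and post fields are invented for illustration; the real Facebook and Twitter ranking systems are proprietary and far more complex.

```python
# Hypothetical sketch of popularity-based feed ranking. The scoring
# weights below are assumptions, not the platforms' actual values.

def engagement_score(post):
    # Weight shares more heavily than likes, a common assumption in
    # discussions of feed ranking.
    return post["likes"] + 2 * post["shares"]

def rank_feed(posts):
    # Sort the feed so the most-engaged-with posts appear first.
    return sorted(posts, key=engagement_score, reverse=True)

feed = [
    {"title": "Vetted local news report", "likes": 120, "shares": 10},
    {"title": "Sensational unvetted story", "likes": 900, "shares": 400},
    {"title": "Nonpartisan policy explainer", "likes": 300, "shares": 50},
]

for post in rank_feed(feed):
    print(post["title"])
```

Note that the ranking pays no attention to accuracy: the unvetted story wins the top slot purely because it drew the most engagement, which is exactly the vulnerability discussed below.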
A prime example of social media algorithms having a possibly dangerous effect on our democracy came during the 2016 election. While researchers on the topic believe that the “fake news” (or “news sources that are not vetted for accuracy”) being promoted by websites like Facebook may not have had any noticeable effect on the outcome of the election, they do believe that fake news had some impact on individual voting choices, enough to be noticeable (Kurtzleben 2018). During the 2016 election, Facebook did little to prevent fake news from using its algorithm to ride to the top of many people’s feeds. It has been estimated that more than a quarter of voting-age Americans visited at least one news source that was not vetted for accuracy (whether in support of Clinton or Trump) during the final weeks of the 2016 campaign (Kurtzleben 2018). Whether these visits had any effect on how these Americans decided to vote is uncertain; however, repeated exposure to fake news has been shown to increase the likelihood that the fake story is believed (Cannon et al. 2017). This issue is not helped by the fact that Facebook was essentially a breeding ground for fake news in the months prior to the election. In the months leading up to election day, unvetted stories routinely outperformed real news stories in terms of interactions (Pennycook et al. 2017). When algorithms are exploited to show false information over real information, people can have a hard time distinguishing what is true.
Another issue with algorithmic sorting by popularity is that it increases the number of partisan sources of information in a feed. Popular partisan content that may not even be entirely accurate can easily be propelled to the top of feeds if enough people interact with it. Studies also show that the more an article or comment is “liked,” the more likely it is to receive additional “likes” (Shmargard and Klar 2020). Partisan content is more likely to be interacted with than nonpartisan content because of selective exposure: people are more likely to engage with content they identify with and feel comfortable with than with content that is contrary to their beliefs. As a result, people who only engage with perspectives that align with their own, the in-party, are more likely to view negatively people who think differently, the out-party. This creates a scenario where partisans interact only with content they agree with, and the algorithms social media websites use then boost that content higher in the feeds of the partisans’ followers. Now, this does not mean that social media is simply an echo chamber where each side sticks to its own news and never interacts with the other side; in fact, social media allows for many more interactions between people with different opinions than they would otherwise have had (Shmargard and Klar 2020). What it does mean is that the use of popular sorting by social media websites increases political polarization, making the interactions that in-parties have with out-parties much more likely to be negative than positive. Interactions with members of out-parties become hostile, which makes positive engagement even less likely. The result is a system where users gravitate toward sources that align with their views because those sources make them comfortable, even if they are not always telling the truth.
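The “likes attract likes” dynamic described above can be illustrated with a toy simulation. Here each new user likes one of two posts with probability proportional to its current like count, a preferential-attachment assumption made for illustration only, not the platforms’ actual mechanism.

```python
# Toy simulation of the feedback loop where already-popular content
# attracts further likes. Starting counts and probabilities are
# illustrative assumptions.

import random

def simulate(initial_likes, rounds, seed=0):
    rng = random.Random(seed)
    likes = list(initial_likes)
    for _ in range(rounds):
        # Pick a post weighted by its current popularity, then like it.
        pick = rng.choices(range(len(likes)), weights=likes)[0]
        likes[pick] += 1
    return likes

# Two posts start nearly tied; a small early lead tends to snowball,
# so the final counts are usually far apart.
print(simulate([11, 10], rounds=1000))
```

The exact outcome varies with the random seed, but the general pattern is that an early lead compounds, which mirrors how a partisan post with an initial burst of engagement keeps climbing the feed.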
The use of popular sorting by social media algorithms has had many unintended consequences for our political environment. However, this does not mean that social media websites are necessarily a bad place to obtain news. Social media can be an excellent source of news, and it has become one of the largest providers of it. According to research on news usage across social media websites in 2020 by the Pew Research Center, approximately 71% of all Americans use social media as a source for news (Mitchell and Shearer 2021). This is up 3% since 2018, showing that the use of social media for obtaining news is steadily increasing. Of the platforms used for obtaining news, Twitter has the largest share of users who regularly get news from the site, and Facebook is used by over a third of all Americans (not just Facebook members) as a regular source of news (Mitchell and Shearer 2021). Popular sorting makes news much easier to access, despite concerns over news sources that are not vetted for accuracy. Most Americans say that they view the news they see on social media as “largely inaccurate,” which suggests they do not blindly believe every story that appears in their feed (Mitchell and Shearer 2021). However, as stated earlier, prolonged exposure to fake news does increase the likelihood that it is believed. So, while social media is great for easily accessing news, its consumers need to place more scrutiny on the algorithms that power these platforms.
The effects that social media algorithms have on our political environment matter for a number of reasons. For starters, as stated earlier, social media algorithms increase polarization through popular sorting. Popular sorting encourages individuals to be more selective about what information they consume, making them more likely to select partisan news sources over nonpartisan ones (Shmargard and Klar 2020). Polarization causes tension between opposing sides, which makes it much more difficult for the two sides to want to work together, or even interact with one another. If regular Americans refuse to interact with each other because of conflicting political views, then it becomes much more difficult for our representatives to want to work together. Social media platforms are used by millions of Americans, which makes partisan information difficult to avoid (especially if it is being pushed to the top of one’s feed). Polarization is a big issue today, and with increased access to our fellow Americans and their political opinions, the issue is only going to get worse. Another major effect of the algorithms employed by social media sites is how easy they make it for fake news to spread. As we saw in the 2016 presidential election, bad actors abused social media algorithms to push untrue, misleading news in an effort to promote one candidate over the other (Kurtzleben 2018). Whether or not this actually had any significant effect on the 2016 election is up for debate; however, the ramifications are still felt today, with countless inaccurate or untruthful news stories still being pushed to the public. As more and more Americans consume fake news as a part of their daily lives (whether intentionally or not), the effects of entertaining falsehoods will become more prominent, and it will be difficult to convince Americans that a source they trusted was wrong.
If social media platforms are going to fix the mess their algorithms are creating, then they need to start changing how their feeds operate. For starters, social media websites should reevaluate whether the risks posed by popular sorting outweigh its benefits. Another good change would be the implementation of a more effective fact-checking system. Fact-checking resources are becoming more common on social media websites, but most of them do not entirely prevent the sharing of news sources that are not vetted for accuracy. Nonpartisan sources of information should be easily accessible to the public to help counteract the influence of partisan sources. Changing the default sorting method to a time-based one could also prove effective in reducing the issues caused by popular sorting. As the influence of social media platforms grows, so too do the issues they present.
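The difference between the current default and the time-based alternative proposed above can be sketched side by side. The post fields below are invented for illustration; real feeds would use richer signals.

```python
# Sketch contrasting popularity-based ranking with a simple
# reverse-chronological (time-based) feed. Field names are assumptions.

def sort_by_popularity(posts):
    # Current default: most-liked content rises to the top.
    return sorted(posts, key=lambda p: p["likes"], reverse=True)

def sort_by_time(posts):
    # Proposed alternative: newest first, regardless of engagement.
    return sorted(posts, key=lambda p: p["posted_at"], reverse=True)

posts = [
    {"title": "Viral partisan post", "likes": 5000, "posted_at": "2021-03-01"},
    {"title": "Recent nonpartisan report", "likes": 40, "posted_at": "2021-03-03"},
    {"title": "Older fact-check", "likes": 200, "posted_at": "2021-03-02"},
]

print([p["title"] for p in sort_by_popularity(posts)])
print([p["title"] for p in sort_by_time(posts)])
```

Under popularity sorting the viral partisan post leads the feed; under time-based sorting the most recent item leads, so a burst of engagement no longer buys extra visibility.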
In conclusion, social media algorithms enable a dangerous cycle of polarization by promoting content that is false and misleading, as well as by prioritizing partisan media over nonpartisan media. We discussed how these algorithms operate and how their method of popular sorting creates many problems. We viewed the effects of popular sorting on the 2016 presidential election, and also how fake news has continued to grow since. We also discussed the selective exposure that many Americans engage in, and how popular sorting makes it much easier for Americans to sink into strictly partisan news sources. We looked at how many Americans use social media as a news source, and in turn how large an impact bad algorithms can have. Finally, we discussed how exactly our political environment is impacted by social media algorithms, and how we could potentially solve these issues. While social media can be a great tool for keeping its users informed about current events, it can also be used as a tool for mass misinformation and polarization. If the issues surrounding social media are to be solved, then a conversation needs to occur about the main issue: the algorithms.