3.3 Algorithms on Social Media Feeds

In the same way that search results are tailored by a combination of personal factors (such as search history and account preferences), social media newsfeeds are also manipulated to show users what they ‘want’ to see. A social newsfeed is a constantly updating list of stories posted by or about a person’s community within each platform (on Facebook: friends; on Instagram: the accounts the individual follows). It prioritizes what a person will find interesting and, in doing so, filters out other content. The algorithm determines whose content each user sees and when they will see it. Sometimes a post from Thursday might appear on your Instagram feed on a Sunday; this is part of the algorithm’s design and is shaped by each user’s behaviour.


Figure 17: Facebook newsfeed on Internet browser and mobile device (Guynn, 2018).


A 2020 Reuters study found that only 22% of people surveyed in the US trust the news they gather from search engines, and just 14% trust news that appears on their social media feeds (Vorhaus, 2020); even so, individuals worldwide continue to seek news on social media. Chapter two described social media and its impact on news consumption, and some points are worth reiterating here. In a 2021 survey, 41% of Canadians stated that they get their weekly news from Facebook (Watson, 2022). The pandemic produced an uptick in both television and social media news consumption, especially on the platforms owned by Meta (Facebook, Instagram, and WhatsApp). An international report on news consumption that asked individuals living in the UK “which of the following have you used in the last week as a source of news” found that most access their news online, especially those under the age of 35 (Newman et al., 2020, p. 12). Those over 35 are more likely to watch news on television, while very few people get their news from print sources. With that context, let’s look at the algorithms that drive the information presented to each person.

3.3.1 Facebook

Facebook’s algorithm considers several elements when presenting information in a newsfeed:

  • Friends’ posts the user has engaged with, specifically those the user has liked, commented on, or shared. Facebook presents content from these friends more often, while withholding posts from friends the user has not engaged with for a long time (Facebook Help Center, n.d.a).
  • Content deemed to be of high interest. If an individual clicks on a certain type of post often, they will increasingly see posts with similar content (Facebook Help Center, n.d.a).
  • Content based on previous searches. Facebook tracks IP addresses and cookies to select content for newsfeeds, including advertisements (Facebook Help Center, n.d.b). When a person searches Google for, say, a cat costume for Halloween, advertisements for cat costumes will suddenly flood their Facebook newsfeed.

Users can limit these aspects of the algorithm by regularly clearing their browser history and tightening their privacy settings. However, some aspects of the social algorithm are unavoidable, such as content being promoted based on its popularity: the more people react to a post (in any way), the more often it is placed on others’ newsfeeds. Exciting, funny, or outrageous false news stories therefore tend to go viral. People become interested in the headline, so they are likely to pause while scrolling, hover over the title for a second or two, click, or share (Guynn, 2018).
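To make the signals above concrete, here is a minimal sketch of a toy feed-ranking function. This is purely illustrative: the weights, field names, and scoring formula are all hypothetical and are not Facebook’s actual algorithm, which is proprietary and far more complex.

```python
# Toy sketch only: combines the three signals described above
# (friend engagement, content-type interest, overall popularity).
# All weights and field names are hypothetical.

def rank_feed(posts, user):
    """Return posts ordered by a toy relevance score for this user."""
    def score(post):
        s = 0.0
        # Signal 1: how often the user has engaged with this friend's posts
        s += 3.0 * user["engagements_with"].get(post["author"], 0)
        # Signal 2: the user's affinity for this type of content
        s += 2.0 * user["topic_clicks"].get(post["topic"], 0)
        # Signal 3: overall popularity -- any reaction counts
        s += 1.0 * (post["likes"] + post["comments"] + post["shares"])
        return s
    return sorted(posts, key=score, reverse=True)

user = {
    "engagements_with": {"alice": 5, "bob": 0},
    "topic_clicks": {"cats": 4, "news": 1},
}
posts = [
    {"author": "bob", "topic": "news", "likes": 900, "comments": 50, "shares": 30},
    {"author": "alice", "topic": "cats", "likes": 10, "comments": 2, "shares": 1},
]
ranked = rank_feed(posts, user)
```

Notice that in this toy example, the viral post from a stranger outranks the post from a close friend: raw popularity can swamp personal affinity, which is one mechanism by which sensational false stories spread.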

3.3.2 TikTok

While Facebook is the most visited social media platform among Canadian adults, Generation Z (those born after 1996) are captivated by TikTok, a video platform featuring a powerful recommendation system (Connell, 2021). Like those used on shopping sites and streaming services, TikTok’s (2021) algorithm uses preferences and interactions to create a unique rolling feed. The user experience is shaped by three drivers that determine the content presented. The first driver is the make-up of the For You page, which appears when a user opens the app: a constantly populating feed of videos that can be changed with a swipe up. This plays a large part in the content people continue to see (or not see). Within the For You page, TikTok tracks numerous signals, including accounts followed, comments made, videos liked, videos hidden or marked as inappropriate, and perceived interests inferred from interactions with advertisements (Newberry, 2022). The algorithm records many other actions as well, then uses all of this information to determine what users will see in the future (Newberry, 2022).

The second driver is the content searched for in the Discover Tab. This is where users can search for specific accounts and hashtags (Newberry, 2022). The third driver is the type of device used and account settings. This means that TikTok accounts for language, geography, and even what type of phone a person is using when determining what content to show them (TikTok, 2021; Newberry, 2022).

But how do the recommendations begin in the first place? If a user hasn’t used the app yet, how does it determine what content to show? When a person creates an account, they are met with several optional questions. If the user answers them, they will see content that aligns with their answers. If they choose not to, they are shown a general suite of videos that are popular in their area, selected partly from personal information they have shared (for instance, age). Their actions going forward will shape their future experience (TikTok, 2020).
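The cold-start behaviour described above can be sketched as a simple fallback rule. Again, this is a hypothetical illustration, not TikTok’s actual system: the function name, fields, and popularity-by-views shortcut are all assumptions made for the example.

```python
# Illustrative sketch of cold-start recommendation: a new account that
# answered the onboarding questions sees content matching its stated
# interests; otherwise it falls back to videos popular in the user's
# region. All names and fields are hypothetical.

def initial_feed(videos, region, interests=None):
    """Return a starting feed for a brand-new account."""
    if interests:
        matching = [v for v in videos if v["topic"] in interests]
        if matching:
            # Stated interests available: show matching content first
            return sorted(matching, key=lambda v: v["views"], reverse=True)
    # No answers given: fall back to generally popular local videos
    local = [v for v in videos if v["region"] == region]
    return sorted(local, key=lambda v: v["views"], reverse=True)

videos = [
    {"topic": "cooking", "region": "CA", "views": 5000},
    {"topic": "dance", "region": "CA", "views": 9000},
    {"topic": "cooking", "region": "US", "views": 12000},
]
feed = initial_feed(videos, region="CA", interests=["cooking"])
```

From this starting point, every subsequent interaction (watching, skipping, liking, hiding) would feed back into the ranking, which is why two accounts created on the same day can quickly end up with very different For You pages.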

There are downsides to the never-ending recommendation functions used by TikTok. The most concerning is that they promote addictive behaviour (Saurwein & Spencer-Smith, 2021); this design is deliberate, intended to keep users on the platform longer. Netflix and YouTube use similar algorithms in their autoplay features, which let users continue watching without making a conscious choice. The result is people watching more videos than intended and engaging with unexpected content (Saurwein & Spencer-Smith, 2021). This may seem harmless, even pleasant, because everyone loves watching a great new series; however, there are circumstances where autoplay can be harmful. There have been multiple instances of autoplay showing violent or sexual content in the children’s version of YouTube (Maheshwari, 2017). These videos are shown automatically to kids despite the app being marketed as family friendly and despite a filter whose purpose is to remove questionable content. And these videos generate ad revenue for both the creators and YouTube (Maheshwari, 2017). Autoplay happens so quickly that many parents may not even realize their children are watching questionable videos.

As we have seen, the online systems that seem to make our lives easier dictate much of what we encounter in our searching behaviour and on social media. Next, we will examine how automated systems can spread information and disinformation as we review a few malicious actors that thrive in this automated environment.




Disinformation: Dealing with the Disaster Copyright © 2023 by Saskatchewan Polytechnic is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License, except where otherwise noted.
