W9.1: Social Media

Due Nov 1

Social Media: Activism, Algorithms, Echo Chambers

Led by Teresa, Lucy and Lauren

Skim through the following articles and extract something interesting from each one:

  • Oremus, Will, et al. “How Facebook Shapes Your Feed.” The Washington Post, WP Company, 26 Oct. 2021: link.
  • Murkett, Kristina. “‘Simplifying Complicated Issues Can Be Tricky’: The Problem With Social Media Infographics.” The Stylist, 27 July 2021: link.
  • Sunstein, Cass R. “Ch. 3, Polarization.” #Republic: Divided Democracy in the Age of Social Media, Princeton University Press, Princeton, NJ, 2017, pp. 59–69: link.

How should we balance information overload against the creation of echo chambers and polarization? And how do we weigh posting about a cause to raise awareness against performative activism?

Address this quandary in a comment. Feel free to draw on your prior reading and experience, but if possible make explicit reference to one of the readings assigned above.

7 responses to “W9.1: Social Media”

  1. In her article “‘Simplifying Complicated Issues Can Be Tricky’: The Problem With Social Media Infographics,” Murkett discusses the balancing act required between restricting misinformation and promoting conversation. One way to combat echo chambers and polarization while maintaining this balance is to follow, or engage with posts from, accounts that hold different views than your own or offer something different. Engaging with people you may not agree with can promote conversation, which Murkett regards as essential, and can broaden views enough to prevent total polarization. It would also shift the algorithm toward a wider range of posts representing many different views. Murkett also examines concerns over the misinformation that many posts spread, but this strategy encourages people to do their own fact-checking: if they are skeptical of someone’s information, they are more likely to research it themselves. It essentially combats confirmation bias. From personal experience, it is clear that many people have fallen into the “block the haters” trap, but I think the benefit of engaging with critics (perhaps not the hostile ones) is easily overlooked. We need to start considering what we can learn from people with different perceptions of the world, even if it doesn’t change our own views.

  2. As echo chambers become increasingly prevalent with the role of social media in today’s society, a question arises as to whether the platforms themselves, such as Instagram or Facebook, should take matters into their own hands to combat the issue. Oremus’s “How Facebook Shapes Your Feed” discusses how the algorithm determines what a person sees on their account, and notes that there are ways to diversify it. The article mentions that completely cutting off posts from accounts a user doesn’t follow leads to the domination of already-large accounts and hinders the ability of smaller accounts with good ideas to trend; on the other hand, by surfacing things the user may not agree with for the sake of diversity, the platform puts its engagement at risk. Thus, I feel the responsibility still lies with the platforms themselves to use their algorithms to find the most effective way to maximize engagement without the repercussion of echo chambers. Although engagement may decline when opposing views appear on one’s feed, those views can further educate the user, whether by strengthening their current positions or introducing them to new perspectives.

  3. Facebook’s approach to pushing content tailored to each user is accomplished only through its built-in algorithm, and it is with this algorithm that the tension between information overload and the formation of echo chambers becomes evident. Will Oremus’s “How Facebook Shapes Your Feed” effectively addresses some of the behind-the-scenes work at Facebook and how it delivers billions of users their unique feeds every day. Notably, “today’s algorithm can turn feeds into echo chambers of divisive content and news, of varying reputability, that support the outlook [of users].” While it may be up to the user to decide which posts to spend their time on, the fate of the feed is ultimately up to Facebook, which holds the power within the algorithm. As a result, the solution for balancing the spiral into polarization against the burden of social media overload is rooted in how the user chooses to interact with the platforms they use. Furthermore, recurring issues have arisen around advocacy on social media as the idea of “clickbait” comes into play, where a user creates a false headline with the intent of garnering attention or views; essentially a publicity stunt disguised as an attempt to raise awareness. At the other end of the spectrum, conflicts are produced through disagreements over performative activism on these platforms. The solution here is similar, but the responsibility rests solely on the community; the organization can only regulate to an extent or outright end the conversation. Continued effort from social media users must become widespread, or each individual will succumb to the detriments of the big organizations.

  4. It is undeniable that social media leads to polarization, arguably because of the content people see on their feeds due to an algorithm. In my personal experience, algorithms have made social media more of a guaranteed entertainment source, since my feeds are personalized to me. However, the Washington Post article states, “The downside of this approach was that the posts that sparked the most comments tended to be the ones that made people angry or offended them, the documents show. Facebook became an angrier, more polarizing place.” Evidently, with an algorithm, people are often exposed to content to their liking, prompting enthusiastic feedback that can heavily oppose and differ from other people’s opinions. Relatedly, Murkett’s article notes, “If you are a member of a particular political party or have strong convictions, you might want support, reinforcement, and ammunition, not criticism.” Because people want to be surrounded by those who agree with them, enthusiastic feedback on posts can lead to polarization, since people will respond negatively with that same amount of passion. Social media inhibits tolerance of disagreement. Additionally, information overload comes into question when Murkett’s article observes that “Infographics have clearly created a new intersection between art, politics and social media,” whether you view them as “shallow, aestheticised examples of virtue-signalling or brilliant, bitesize drivers of social and behavioural change.” In today’s society, people are expected to know much of what is going on, so infographics are appealing: it seems more doable to keep up if you receive your information in short phrases and pictures. However, this could be used to promote balance between information overload and polarization if infographics were less biased or incorporated varying perspectives.
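The dynamic the Washington Post quote describes can be made concrete with a toy sketch. This is not Facebook’s actual code or its real signal weights; it is a hypothetical engagement-weighted ranker (the function name, post fields, and weight values are all invented for illustration) showing how heavily weighting comments lets provocative posts outrank well-liked but calmer ones:

```python
# Toy illustration of engagement-weighted feed ranking (hypothetical, not
# Facebook's actual algorithm or weights). Each post is scored as a weighted
# sum of its engagement signals; comments are weighted far above likes,
# echoing the reporting that comment-sparking content was amplified.

def rank_feed(posts, weights):
    """Return posts sorted by weighted engagement score, highest first."""
    def score(post):
        return sum(weights[signal] * post.get(signal, 0) for signal in weights)
    return sorted(posts, key=score, reverse=True)

# Hypothetical weights chosen only to illustrate the imbalance.
WEIGHTS = {"likes": 1, "comments": 15, "shares": 30}

posts = [
    {"id": "calm_photo", "likes": 500, "comments": 5, "shares": 2},
    {"id": "angry_rant", "likes": 80, "comments": 120, "shares": 25},
]

ranked = rank_feed(posts, WEIGHTS)
# The post that sparks many comments (score 2630) outranks the post with
# far more likes (score 635), even though fewer people "liked" it.
```

Under these made-up weights, a post with 120 angry comments beats one with 500 likes, which is exactly the incentive structure the responses above are criticizing.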

  5. When reading “How Facebook Shapes Your Feed,” I found it interesting that a move that was supposed to be innovative ended up being detrimental to our society. With the algorithm implemented, the News Feed was supposed to become a place for users to continually find diverse content that might catch their interest. However, because Facebook let its own interests guide its decisions, it places trending articles at the top of the feed for everyone to see, where likes, comments, and other interactions accumulate. With Facebook holding such power, users have been inundated with information and articles they are not interested in or dislike. In this case, I found the solution Frances Haugen proposed interesting: removing the power to control our feeds from Facebook entirely. This removal of power would allow the app to return to its original purpose: connecting with close ones and meeting new people. I agree with this proposal. There are other platforms for politics and its problems, and Facebook should not be one of them.

  6. Part of the reason “social media activism” is such a complex issue is that the intent of the user often does not align with the impact. Some individuals may genuinely try to bring light to a particular issue, but end up continuing the circulation of misinformation in the form of pre-made posts and infographics. Conversely, I have seen many people use Instagram activism as part of an aesthetic, for example making a highlight that perfectly fits their feed. Either way, misinformation can spread very easily and very fast. Additionally, when a certain post or graphic has been shared over and over again, it can come across as more reputable than it actually is. I think Murkett’s article “Infographics are all over Instagram, but can we believe what we see?” addresses this issue quite well, as she discusses it from both sides: those who view infographics as “shallow, aestheticised examples of virtue-signalling” (Murkett) and those who believe they can be used to create genuine social change. I also thought “#Republic: Divided Democracy in the Age of Social Media” raised a good point about how people “enjoy ‘appropriately slanted’ stories about the events of the day” (Sunstein). This is especially applicable to social media, as people are more willing to spread information that fits their preconceived ideas or values, even if it slightly strays from the truth. While there is no clear answer for combating echo chambers and misinformation, the takeaway from both readings may be that it is important to do your own research and form your own opinions from it. From there, it is important to consider the potential impact of posting on social media and be especially wary of misinformation you may encounter.

  7. As the Washington Post explains, social media outlets like Facebook use what is known as “the algorithm” to filter what content users receive. In doing this, a tunnel of directed and biased content is created for many people. I found it interesting, even concerning, to read explicitly about the Facebook employees who “decide what data sources the software can draw on.” I see this as another layer of bias added to the content we consume. When thinking about how to balance the echo chamber that is created, I think of the studies noted in Cass Sunstein’s piece. If people are more likely to consume information based on where it comes from rather than what it is about, then counteracting the echo chamber would call for restructuring the main organizations pushing out information. On a personal level, I would say that knowing bias exists, and being willing to draw on multiple sources with differing opinions to form your own ideas, gives you a more well-rounded understanding that isn’t fueled solely by a slideshow of Instagram infographics.
