How much of what you believe is based on what you saw, versus what an algorithm decided you should see?
As technology advances, personalized algorithms increasingly shape our perspectives, often without us even realizing it. These automated systems mold individual values, sometimes leading to social disruption and misinformation. Society is so used to this that most people are blind to the fact that it is happening.
Research from Tomorrow Bio highlights how algorithms, while optimizing content for engagement, contribute to mental health issues by encouraging addictive behavior and narrowing exposure to diverse viewpoints.
In the section discussing how algorithms shape online communication and influence human behavior, the author emphasizes the unintended social consequences of algorithmic filtering. As the article explains, “People’s daily interactions with online algorithms affect how they learn from others, with negative consequences including social misperceptions, conflict and the spread of misinformation, my colleagues and I have found.”
This highlights the growing concern that the personalized nature of algorithms can distort understanding.
Building on this idea, the article also shows how users are often unaware of the extent to which their social media experience is curated. The author explains, “Algorithms determine in part which messages, which people, and which ideas social media users see.”
This quote reinforces the point that algorithms play a powerful role in shaping what people believe and how they interact, not by showing the full picture, but by presenting a selectively filtered version of reality.
Personalized algorithms are systems that analyze your past behaviors, such as your clicks, likes, and shares, to predict and show you content that will engage you the most. These systems power everything from your social media feed to recommended YouTube videos and targeted ads.
There are a multitude of reasons why personalized algorithms can be harmful, not only for students and teenagers but also for adults. According to Tomorrow Bio, algorithms “play a crucial role in shaping the online experiences of social media users. They determine which posts, images, and videos are shown to us, and in what order. By analyzing our past interactions and behavior, algorithms try to predict what content will engage and captivate us the most.”
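To make this mechanism concrete, here is a minimal sketch in Python of how an engagement-based feed ranker might work. All the names, weights, and data are invented for illustration, not taken from any real platform: the point is simply that candidates get scored by how similar they are to what the user engaged with before, and the feed is sorted by that score.

```python
# Minimal sketch of an engagement-based feed ranker.
# All names, weights, and data here are hypothetical; real platform
# rankers are far more complex, but the core loop is the same:
# score candidate posts by predicted engagement, show the top ones.

from collections import Counter

def build_interest_profile(past_interactions):
    """Count how often the user engaged with each topic,
    weighting stronger signals (shares) more than weaker ones (clicks)."""
    profile = Counter()
    for post in past_interactions:
        weight = {"click": 1, "like": 2, "share": 3}[post["action"]]
        profile[post["topic"]] += weight
    return profile

def rank_feed(candidates, profile):
    """Order candidate posts by predicted engagement:
    topics the user engaged with before score highest."""
    return sorted(candidates,
                  key=lambda post: profile[post["topic"]],
                  reverse=True)

past = [
    {"topic": "fitness", "action": "like"},
    {"topic": "fitness", "action": "share"},
    {"topic": "politics", "action": "click"},
]
candidates = [
    {"topic": "cooking", "id": 1},
    {"topic": "fitness", "id": 2},
    {"topic": "politics", "id": 3},
]

profile = build_interest_profile(past)
for post in rank_feed(candidates, profile):
    print(post["id"], post["topic"])
# Fitness posts float to the top; cooking, which the user has never
# engaged with, sinks to the bottom. That sinking is exactly the
# narrowing of exposure the article describes.
```

Notice that nothing in this loop ever asks whether the hidden content might be valuable; it only asks what the user will click on next.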
Among college students, this issue is especially visible. Many students are effectively addicted to their phones: as a first-year college student, I often see classmates check theirs every few minutes, even without a notification, almost like a habit or reflex.
Another clear example is when a professor gives students a quick five-minute break to prepare for a quiz or a content-heavy lesson; instead, many students spend the time scrolling through social media, giving up the chance to review. This pull is reinforced by algorithms that tailor every video to each student's personal interests and beliefs, keeping them attached to the phone.
This has impacted student interactions, reducing face-to-face conversation and narrowing how students express complex ideas. It also shrinks people's need to interact with one another at all, since the entertainment they want is already on their phones. Deliberately seeking out diverse perspectives and opinions on social media is one way to push back against this narrowing.
Personalized algorithms tend to expose individuals only to content that aligns with their existing beliefs and interests. This can give people a sense of comfort and reliability, making them more attached to social media, but at the same time it limits their exposure to diverse perspectives and new ideas. This phenomenon is known as an echo chamber.
An echo chamber is a space where people only hear opinions that match their own. Because they keep hearing the same ideas over and over, their beliefs get stronger, and they don’t get exposed to different viewpoints.
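To see how quickly this feedback loop compounds, here is a toy simulation with purely illustrative numbers (the starting share and reinforcement rate are assumptions, not measurements). If each feed refresh shifts slightly toward whatever got engagement last time, one viewpoint comes to dominate within a few dozen refreshes.

```python
# Toy simulation of an echo-chamber feedback loop.
# Assumptions (invented for illustration): the feed starts showing
# viewpoint A 60% of the time, and each refresh moves 10% of the
# way toward pure A, because A is what the user engages with.

share_of_viewpoint_a = 0.60   # fraction of the feed matching the user's view
reinforcement_rate = 0.10     # how strongly engagement skews the next refresh

for refresh in range(1, 31):
    # The feed drifts toward showing only what got engaged with.
    share_of_viewpoint_a += reinforcement_rate * (1.0 - share_of_viewpoint_a)
    if refresh % 10 == 0:
        print(f"after {refresh} refreshes: {share_of_viewpoint_a:.0%} viewpoint A")

# after 10 refreshes: 86% viewpoint A
# after 20 refreshes: 95% viewpoint A
# after 30 refreshes: 98% viewpoint A
# Opposing views all but disappear, even though no single step felt drastic.
```

No individual refresh feels like censorship; the narrowing only shows up in the cumulative drift, which is why echo chambers are so hard to notice from the inside.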
Possible solutions to these problems include stronger regulation, explainable AI, and increased data transparency. Regulation would hold platforms accountable for the impact of their personalized algorithms on public discourse and individual beliefs. Explainable AI promotes the development of AI systems that are more transparent and understandable, allowing users to see how decisions are made within these personalized algorithms. Lastly, data transparency requires platforms to provide users with clear, easy-to-access information about how their data is collected, used, and shared.
The influence of algorithms on our beliefs and behaviors isn’t always recognizable, but it’s deeply powerful. Recognizing this influence is the first step.
To reclaim control, we must challenge what we see online, seek out diverse viewpoints, and hold tech platforms accountable for the content they prioritize. Awareness, combined with action, is how we break free from algorithmic manipulation.
Resources
Scientific American, 2023. Social Media Algorithms Warp How People Learn from Each Other. https://www.scientificamerican.com/article/social-media-algorithms-warp-how-people-learn-from-each-other
Tomorrow Bio, 2023. Algorithmic Identity and Mental Health: The Impact of Social Media Algorithms on Well-Being. https://www.tomorrow.bio/post/algorithmic-identity-and-mental-health-the-impact-of-social-media-algorithms-on-well-being-2023-10-5364893362-futurism