Your Social Media Feed Is a Mirror of Your Beliefs
Over the last few years I have come to realize something disturbing about the world we live in, and I do not think I am alone. We are not just disagreeing more; we are living in completely different realities. Same world, same events, but completely different interpretations of what is happening and why.
Algorithms Shape Belief
Social media algorithms do not care if something is true, nuanced, or fair. They care only about what keeps you engaged. That usually means content that confirms what you already believe, or that makes you angry enough to keep scrolling.
If you interact with one political take, you get more of it. If you pause on a slightly extreme video, the next one is a little more extreme. Not because someone planned it that way, but because that is how the system learns what to show you. It is a feedback loop that keeps you in a bubble, and it is very good at it.
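To make that loop concrete, here is a deliberately simplified sketch in Python. It is not how any real platform works, and every name, weight, and number in it is an assumption made up for illustration, but it shows the basic mechanic: score candidate posts by predicted engagement, show the top one, and let that choice train the next round.

```python
import random

# Each post has a topical leaning (-1.0 to 1.0) and an emotional intensity (0.0 to 1.0).
# These fields and the scoring rule are purely illustrative, not any platform's real model.
def make_candidates(n=50):
    return [{"leaning": random.uniform(-1, 1), "intensity": random.uniform(0, 1)}
            for _ in range(n)]

def predicted_engagement(post, user_history):
    # Naive proxy: similarity to what the user engaged with before,
    # plus a bonus for emotionally intense content.
    if not user_history:
        return post["intensity"]
    avg_leaning = sum(p["leaning"] for p in user_history) / len(user_history)
    similarity = 1 - abs(post["leaning"] - avg_leaning) / 2  # 1.0 = identical leaning
    return 0.7 * similarity + 0.3 * post["intensity"]

def run_feed(steps=20):
    history = []
    for step in range(steps):
        candidates = make_candidates()
        # Show the single post the model expects to be most engaging.
        shown = max(candidates, key=lambda p: predicted_engagement(p, history))
        # Assume the user engages with it; that interaction trains the next round.
        history.append(shown)
        print(f"step {step:2d}: leaning={shown['leaning']:+.2f} "
              f"intensity={shown['intensity']:.2f}")

if __name__ == "__main__":
    random.seed(1)
    run_feed()
```

Run it a few times with different seeds and the pattern is the same: whatever happens to get engagement early ends up defining what is shown next, and the feed settles around that leaning with the most intense content on top.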
Over time, your feed becomes a mirror. Not of reality, but of your existing views.
That is how people end up genuinely believing that everyone agrees with them, and that anyone who does not must be stupid, evil, or manipulated. Maybe, like me, you know someone who treats whatever they see on Facebook as the only truth and assumes anyone who sees it differently must be brainwashed. They are not alone; many people relate to their feeds in exactly the same way.
How Belief Slowly Hardens
The scary part is that, from the inside, this does not feel like manipulation. It feels like clarity.
Because when you see the same narrative repeated again and again, from different accounts, with strong emotions and simple explanations, it starts to feel obvious. Complex issues get flattened into good versus bad, us versus them, right versus wrong.
This is also how radicalization often starts. Not with extremism, but with certainty. You do not start by believing in a conspiracy. You start by believing that you are right, and that anyone who disagrees is wrong. That is a much easier belief to arrive at, and it is the one that algorithms are really good at reinforcing.
Why Older People Are Especially Vulnerable
I feel this part is uncomfortable to talk about, but it is really important.
Many elderly people did not grow up with algorithmic media. They grew up with newspapers, public broadcasters, and clear editorial responsibility. When something was published, it had usually passed through some kind of gatekeeper. A journalist, an editor, a publisher. That gatekeeper was not perfect, but it provided a layer of authority and credibility. It was not just about the content, but about the source.
Social media removed that gate, but kept the visual authority. Posts look equally credible. A random Facebook page looks a lot like a news outlet if you do not know what to look for.
Add limited digital literacy, and you have a perfect storm. Older people are more likely to trust what they see, less likely to question it, and more likely to be targeted by misinformation campaigns. That is not their fault. It is simply a consequence of how the media landscape has changed.
Critical Thinking
All of this might sound like a problem of education, or of intelligence, but I do not think that is the core issue.
Critical thinking requires slowing down, questioning intent, and being comfortable with uncertainty. Algorithms do the opposite. They reward speed, confidence, and strong emotion. So over time, people stop asking “is this really true” and start asking “does this feel right”.
When something aligns with your worldview, your brain lowers its guard. There is nothing mysterious about this; it is part of being human. We all do it. Some people just stay trapped in it for longer.
The Political Cost
So what is the cost of all this? Not just the cost of misinformation, but the cost of living in different realities?
Well, when algorithms shape our beliefs, they also shape our politics. When you believe that your version of reality is the only reality, it becomes very hard to compromise. It becomes very hard to even have a conversation with someone who disagrees, because they are not just wrong, they are a threat. They are not just an opponent; they feel like an enemy.
So in the end, when you vote, you are not just voting for a candidate or a policy. You are voting for your version of reality. You are voting for the narrative you have been fed and have come to believe, and for the identity you have built around that narrative. You end up defending your own version of reality in order to win the argument, to feel validated, and to avoid the discomfort of doubt.
Again, human nature is not to blame here. We all want to feel right. We all want to feel like we understand the world. But when the system is built to reward certainty and punish doubt, it creates a very toxic environment for democracy.
Reminder to All of Us
Not everything you see is shown because it is important.
Most of it is shown because it is engaging. It is designed to keep you scrolling, to keep you hooked, and to keep you coming back. It is not designed to inform you, to educate you, or to help you understand the world better. It is designed to make money from your attention.
Learning to question why something appears in your feed is therefore just as important as questioning what it says. That skill is poorly taught but incredibly important. It is simply about being aware. It is about recognizing that your feed is not a window to reality, but a mirror of your very own beliefs.
Because when algorithms decide what we see, believing everything that fits our view is not just naive.
It is surrender.
