One day, you hear about a trending issue, maybe an interview of a dictator’s son, so you pull up Twitter, Facebook, TikTok, or another social media platform to learn what happened. Maybe the issue is political, religious, scientific, whatever. What matters is that your timeline says there are only two opposing sides: the pro and the anti.
You keep scrolling, and one side dominates your timeline; most likely, you agree with this group. You follow these people for a reason, after all—you probably share the same principles. Still, the issue is fresh, so you try to read all sides of the debate. Eventually, you take a stand. The pack says you’re either pro or anti, so you have to pick a side depending on your pack’s ideals.
Inevitably, you see a post you don’t agree with, so you unfriend or unfollow whoever made it. Maybe you warn your friends to do the same because that person’s views do not align with your own. You keep doing this—unfriend, unfollow, repeat—until your network is just people who share the same principles. I mean, if you don’t like it, just don’t watch, right?
Eventually, the issue dies down. In the meantime, new issues arise, as they always do, and the same cycle repeats: new debates, pick a side, timeline cleanse. Your group becomes more defined, firmer in its views.
But here’s the danger of this cycle: a few months or years later, the opposing groups will have become immovably firm in their stands. There has been no discourse because the groups have cut one another off, so the misinformed are never educated. Ideas never improve.
By this time, minds can’t be changed anymore. To each group, its own view is correct and the opposite is evil. All this because these people have spent months or years in echo chambers confirming their biases, so facts, statistics, and solid pieces of evidence just bounce off.
What’s wrong with that? First, the wrong never change their minds despite new evidence. History, data, and indisputable facts just won’t work. “If you don’t agree with me, you can unfollow me.”
Second, and more dangerously, echo chambers are breeding grounds for radical ideas. Some beliefs are simply matters of disagreement—some people won’t accept them, and that’s okay. But sometimes ideas evolve and develop factions with extreme beliefs their members would literally kill for. These beliefs become unacceptable because they actively harm others. Groups reach this point because no one educates them, calls them out, or challenges their beliefs, so they grow more certain that their view is the only correct one and everyone else must go. The safest example to state: violent extremist religious groups.
But ideas don’t need to kill to become dangerous enough to fall beyond the realm of acceptance or tolerance. Some ideas encourage physical violence, spread illness, disrespect victims, or take away rights, to name a few. They bring real harm to real people. And most of the time, the minds behind them can’t be changed anymore, so the danger never ceases.
They didn’t start out that way: each was just a “personal opinion” once upon a time. But the culture of timeline cleansing (i.e., if you don’t like it, just unfollow me) led to radicalism, intellectual pride, false dichotomies, and actual harm to real people. Objective discourse became exhausting. Everyone got tired of trying to educate one another, so we resorted to ad hominem attacks and unfollowing instead.
It’s a lot easier to simply clean our timelines so we are surrounded by people who affirm our beliefs, but the consequences of this convenience can bring real harm to people in the real world. So maybe “if you don’t like it, just don’t watch it” isn’t the brilliant take we thought it was. Of course, sometimes we have to clean our timelines for our own mental health, especially those of us with trauma. This is understandable. But in other instances, we can help prevent these ideas from reaching dangerous levels through humble, objective discourse. Surprisingly, many people we’ve branded stubborn or unreasonable actually listen when facts are presented patiently and humbly, if only we step out of our echo chambers. Who knows? We might even learn something new.
For feedback, send an e-mail to lyca.balita@gmail.com