When social media started, it was different from how it is now. It offered a new way for people to communicate: to share ideas, disagree with one another, and have open discussions. Facebook, for example, was built for people to connect with one another. People could easily maintain contact, get updates, and make “friends”.
As more people joined, the scope of social media expanded. People formed groups, advertised businesses, influenced others, shared photos, and more. Some presented an idealized portrait of their lives, giving rise to FOMO, the “Fear of Missing Out.”
What started out as a platform for connecting and sharing different perspectives quickly devolved into something else entirely. Social media has become a place for people to maintain or promote narrow viewpoints, to rally support for them, and to use them to influence others.
Social media companies compete for an important commodity: our attention. To keep us engrossed and addicted to our accounts, they use a number of tactics.
One tactic is to use algorithms to show us content we’re sure to like. The recommendations might be based on our own past selections, or on content viewed by other users “like” us.
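To make the two signals concrete, here is a minimal sketch, with entirely made-up users and topics, of how a “people like you” recommendation could work: find users whose likes overlap with yours, then suggest posts they liked that you haven’t seen. Real platforms use far more elaborate systems; this is only an illustration of the idea.

```python
def similar_users(user, likes):
    """Rank other users by how many liked items they share with `user`."""
    mine = likes[user]
    return sorted(
        (u for u in likes if u != user),
        key=lambda u: len(mine & likes[u]),
        reverse=True,
    )

def recommend(user, likes, top_n=3):
    """Suggest posts liked by the most similar users that `user` hasn't seen."""
    seen = likes[user]
    scores = {}
    for rank, other in enumerate(similar_users(user, likes)):
        weight = 1 / (rank + 1)  # closer neighbors count for more
        for post in likes[other] - seen:
            scores[post] = scores.get(post, 0) + weight
    return sorted(scores, key=scores.get, reverse=True)[:top_n]

# Hypothetical data: each user maps to the set of topics they have liked.
likes = {
    "alice": {"cats", "politics", "cooking"},
    "bob":   {"cats", "politics", "sports"},
    "carol": {"gardening", "cooking", "travel"},
}
print(recommend("alice", likes))
```

Notice the feedback loop: alice’s closest neighbor is bob, so bob’s interests dominate her suggestions, which nudges her likes even closer to his over time.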
I know lots of people who love this personalization. Why would anyone want to spend their time sifting through irrelevant content, or posts not aligned with their interests?
The downside of this system, however, is that people are never exposed to anything different: content that is contrary, challenging, or thought provoking. Whatever you like, social media companies make sure you get more of the same.
Algorithms push people into self-reinforcing content. It’s easy for people to always see the same types of things rather than a wide variety of opinions and viewpoints. This is problematic.
I often read articles about misinformation, disinformation, and hateful content promoted through social media. The “solution” is for social media companies to use algorithms to detect and eliminate this type of content. But that doesn’t really solve the problem; it pushes these viewpoints to other platforms, where they can grow and gain mass followings under the radar.
Perhaps instead of pushing out “banned” content, the algorithms could be readjusted to offer a diverse range of viewpoints. Give users something to consider rather than spoon-feeding them more of the same.
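One way such a readjustment could look, as a hedged sketch with hypothetical data: rather than ranking a feed purely by predicted interest, reserve a quota of slots for posts whose topic falls outside the user’s usual interests.

```python
def diversify(ranked_posts, user_topics, slots=5, outside_quota=2):
    """Fill a feed of `slots` posts, reserving `outside_quota` slots for
    posts whose topic falls outside the user's usual interests."""
    inside = [p for p in ranked_posts if p["topic"] in user_topics]
    outside = [p for p in ranked_posts if p["topic"] not in user_topics]
    return inside[: slots - outside_quota] + outside[:outside_quota]

# Hypothetical candidate posts, already ranked by predicted interest.
ranked = [
    {"id": 1, "topic": "cats"},
    {"id": 2, "topic": "cats"},
    {"id": 3, "topic": "politics"},
    {"id": 4, "topic": "gardening"},
    {"id": 5, "topic": "science"},
]
feed = diversify(ranked, user_topics={"cats", "politics"})
print([p["topic"] for p in feed])
```

The quota is the knob: the familiar content is still there, but a couple of slots always carry something the user would otherwise never see.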