Lately I’ve been reading articles about Meta changing some of its policies. One that really stood out to me was that they no longer want to monitor and fact-check content on Facebook. According to the summary on their about page, they claim the restrictions on the platform went too far. In their view, the rules swung to the other extreme and ended up restricting the “free expression” they set out to enable in the first place.
Social media hasn’t been around that long, but its far reach and speed make it something that needs careful consideration. Less than 10 years ago, content published on Facebook (whose parent company is now Meta) was found to have influenced the outcome of a US presidential election. There is extensive reporting and research on the effects of harmful social and political content posted on Facebook and other platforms. However challenging these issues are to address, they are the companies’ responsibility. Claiming “free expression” and backing off is an abdication of that responsibility.
Some years ago, the argument used to justify any and all posted content was similar to the one made in defense of a newsstand. The newsstand only displays content for sale; it doesn’t create the content itself. This line of thought extended to the social media companies: after all, they were only providing a platform for others to express themselves. In my mind, though, the big difference is that newsstands display published content, which means someone reviewed and edited it before publication. Social media companies, by contrast, let anyone and everyone post whatever they feel like. That ranges from silly things, such as funny animal videos, all the way to the other extreme. Social media is a hodgepodge of political content, medical advice, instruction, cooking shows, tours of people’s refrigerators, and everything in between. The content has no direction or limit, all in the name of “free expression.” But how can we call it “free expression” when some content is so harmful to others, or outright illegal?
This new direction for Meta, and for other social media companies, is going to lead to further problems. In many instances, these same companies are the leaders in artificial intelligence. Now their AI models are going to be trained on the harmful, inaccurate, and damaging content left on their platforms.