For years, the discussion has centered on whether social media companies should be held accountable for the content on their platforms. After the testimony of the latest Facebook whistleblower, the conversation may be shifting: instead of holding these companies accountable for what users post, hold them accountable for the content they choose to show us.
As I’ve blogged before, social media companies need two things from us, their humble users: the content we create, and hours of our unwavering, absorbed attention. They get both in myriad ways.
Creating, sharing, and interacting with content is an easy sell for most users. It looks like a win-win: users get value from the sites by creating and consuming content, and in turn the sites have content that attracts more people. That’s where the second part of the equation comes in. Social media companies need us to want the content on their sites more than anywhere else, and there’s a lot of competition for our attention. Beyond other devices and apps, there’s everyday life: jobs, hobbies, families. Facebook wants to make sure that when we’re online, being online is synonymous with being on Facebook.
One way Facebook and other social media companies accomplish this is with ranking algorithms. These algorithms decide for us which content we should, or shouldn’t, see in our feeds, based on many factors. One of those factors is how likely a piece of content is to elicit a reaction. When we react to content, we’re more likely to engage with it, which means we stay on the site longer.
Another factor is which kind of reaction the content will provoke. According to the whistleblower, Facebook determined that content generating negative reactions is more likely to keep us engaged, and the quality of that engagement is what the discussion needs to focus on. By constantly surfacing content that encourages a strong, negative response, such as anger or outrage, Facebook has figured out how to keep us on the site longer. YouTube employs a similar tactic, continually showing viewers “more of the same,” except that in practice the recommendations grow progressively more extreme, leading the viewer down one path or another.
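To make the idea concrete, here’s a minimal sketch of what reaction-weighted feed ranking might look like. This is not Facebook’s actual code; the reaction names, the weights, and the stubbed-out prediction values are all assumptions for illustration, chosen only to show how weighting high-arousal reactions more heavily pushes provocative content to the top.

```python
# Hypothetical sketch of reaction-weighted feed ranking.
# The reaction types, weights, and predicted probabilities below are
# illustrative assumptions, not any platform's real model.

from dataclasses import dataclass

# Assumed weights: high-arousal reactions (anger, sadness) are valued
# more than a plain "like" because they predict longer engagement.
REACTION_WEIGHTS = {
    "like": 1.0,
    "love": 1.5,
    "haha": 1.5,
    "sad": 2.0,
    "angry": 5.0,  # illustrative: negative reactions weighted highest
}

@dataclass
class Post:
    post_id: str
    # Predicted probability that a user leaves each reaction, as produced
    # by some upstream engagement model (stubbed here with fixed numbers).
    predicted_reactions: dict

def engagement_score(post: Post) -> float:
    """Score a post by its expected, reaction-weighted engagement."""
    return sum(
        REACTION_WEIGHTS.get(reaction, 0.0) * prob
        for reaction, prob in post.predicted_reactions.items()
    )

def rank_feed(posts: list[Post]) -> list[Post]:
    """Order the feed so the highest expected engagement comes first."""
    return sorted(posts, key=engagement_score, reverse=True)

if __name__ == "__main__":
    feed = [
        Post("calm-news", {"like": 0.30, "love": 0.05}),
        Post("outrage-bait", {"like": 0.10, "angry": 0.25, "sad": 0.10}),
    ]
    for post in rank_feed(feed):
        print(post.post_id, round(engagement_score(post), 2))
```

Even with these made-up numbers, the post engineered to provoke anger outranks the calmer one, which is exactly the dynamic the whistleblower described.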
So what’s the solution? More regulation? More transparency? Who knows. It’s a story in progress, and we’re all guinea pigs.