Prioritizing Truth: Who’s Responsible?

There’s no shortage of new features being added to Facebook, features users are only arguably interested in (looking at you, Facebook Stories). There is, however, one constant change on the platform that may be worth paying attention to. Cue ~the algorithm~. The News Feed originally showed content chronologically; since then, Facebook has shifted to prioritizing popular posts, then relevant posts, then video posts, and so on. Most recently, the platform says it aims to “emphasize meaningful interactions,” which, hopefully, means we’ll stop being served the same ad 100 times for the shoe we searched for once.
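
To make that shift concrete, here’s a toy sketch in Python of how reordering a feed by engagement instead of recency changes what surfaces first. The Post fields, the scoring weights, and the function names are all my own invention for illustration; Facebook’s actual ranking code isn’t public, and this is nothing like it.

```python
# Toy illustration only (not Facebook's real algorithm): how a feed
# changes when ranking moves from chronological order to an
# engagement-weighted score.
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    timestamp: int   # e.g., Unix time; larger = newer
    likes: int
    comments: int
    shares: int

def chronological(posts: list[Post]) -> list[Post]:
    """The original News Feed: newest first, nothing else considered."""
    return sorted(posts, key=lambda p: p.timestamp, reverse=True)

def engagement_ranked(posts: list[Post]) -> list[Post]:
    """Engagement-first ranking. The weights are made up; 'meaningful
    interactions' presumably weight comments and shares over likes."""
    def score(p: Post) -> float:
        return p.likes * 1.0 + p.comments * 4.0 + p.shares * 8.0
    return sorted(posts, key=score, reverse=True)

posts = [
    Post("friend", timestamp=300, likes=2, comments=1, shares=0),
    Post("brand_page", timestamp=100, likes=900, comments=50, shares=40),
]
print([p.author for p in chronological(posts)])      # ['friend', 'brand_page']
print([p.author for p in engagement_ranked(posts)])  # ['brand_page', 'friend']
```

Even with made-up weights, the pattern is the point: a high-engagement brand post jumps ahead of a more recent post from a friend.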

So, why isn’t this necessarily good for democracy? As this PBS article explains, “Facebook does not optimize for truth, for learning, or for civil conversation.” The problem, in my opinion, is that the platform puts the responsibility on the user to detect and report fake news. It’s also becoming the user’s job to manually prioritize what they do want to see.

Though I don’t have first-hand experience with these tips, such as clicking the new “report” or “see first” buttons, they seem, for lack of a better word, sketchy. Being asked to take my newsfeed into my own hands reminds me of those old hoax chain messages claiming Facebook would delete my account if I didn’t copy and paste X message into my status. No, just me?

I believe Facebook needs to put less emphasis on engagement metrics and more on truth. Emphasizing meaningful interactions is a step in the right direction, though it will be interesting to see how marketers respond to the change. As someone going into brand-side social media, I’ll have to ask myself how to interact authentically with followers without it feeling like engagement bait. I’ll end this post with one ad that I think gets it right:
