Social Media and Truth

Photo by Oleg Magni on Unsplash

I’ve talked a lot on this blog before about the future of truth in relation to propaganda and deepfakes. Last time I touched on this topic, I discussed the latter. Now, I want to dive into why it’s so hard to remove bad content from social media in the first place.

Let’s say we had a perfectly binary moderation system. In our sandbox system, there is one rule: “You cannot post nude images.” We have a perfect moderation bot that can check with 100% accuracy whether an upload contains a nude image and, if it does, automatically deletes the upload and bans the user.
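As a rough sketch (in Python, with every name hypothetical), the whole system fits in a few lines, precisely because the hard part is assumed away:

```python
# A minimal sketch of the sandbox moderation system described above.
# All names here are hypothetical, and contains_nudity() stands in
# for the "perfect" classifier the thought experiment grants us.

def contains_nudity(image: bytes) -> bool:
    """Pretend this is 100% accurate. In reality, this is the hard part."""
    return False  # placeholder

def delete_upload(upload_id: str) -> None:
    print(f"deleted upload {upload_id}")

def ban_user(user_id: str) -> None:
    print(f"banned user {user_id}")

def moderate(upload_id: str, user_id: str, image: bytes) -> str:
    """One rule, zero exceptions: nude image -> delete the post, ban the user."""
    if contains_nudity(image):
        delete_upload(upload_id)
        ban_user(user_id)
        return "removed"
    return "published"
```

Notice how trivial the policy logic is once you assume a perfect classifier; every real difficulty hides inside that one function.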

Well, wait a minute. That’s fine and dandy, but is every nude image necessarily bad? What about artistic nudity, like a famous painting? Or the Napalm Girl photo from the Vietnam War? Are these really in the same category as straight-up pornography? The answer is, well, they can be. For example, if I’m running a children’s website, the answer is simple: no nude images, no exceptions. But what if we were running something like Facebook or Twitter instead? In that case, such a rule might make people angry and leave them feeling talked down to. So let’s amend it: “You cannot post nude images, with some exceptions.”

The point is that porn is one of those things where, as Supreme Court Justice Potter Stewart famously put it, you “know it when you see it.” It is immensely difficult to codify “you know it when you see it” into a rules system. There are exceptions to everything. The way I see it, there is a range with two extremes. At one extreme, certain items are banned no matter what, which is exactly what I described; this is what you’d see enforced on a kids’ site, where no nudity is good nudity. At the other extreme, everything is an exception; that is, anything goes. Here we find sites like 8chan or Gab, where the only rules enforced are the ones that keep the website’s servers from getting shut down.

Based on this range, sites like Facebook and Twitter try to ride somewhere in the middle. Get too close to the no-exceptions end, and people will think your website is condescending and treats them like children. Get too close to the all-exceptions end, and real-world violence and trouble start to come out of your site. The question here, however, is whether there’s even a middle at all.

Theoretically, there is. In our binary example, we could say something like “No nude images, unless they are for historical or artistic reasons.” But then we’d have to go and define in detail what ‘historical’ and ‘artistic’ mean on our site. Still, that’s not too bad; it would take us two or three hours at most to think through the cases. Now, however, try extrapolating that to the thousands of different micro-sections a site like Facebook has to moderate, and keep in mind that all those definitions are changing by the day. It starts to become a lot more difficult.
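To see how quickly the exceptions pile up, here’s what that single amended rule might look like in the same sketch. Each exception demands its own classifier, every bit as hard to build (and as contested) as the original one, and every name here is again hypothetical:

```python
# "No nude images, unless they are for historical or artistic reasons."
# Every exception adds another classifier we now have to define, build,
# and keep up to date. Reuses contains_nudity() from the sketch above.

def is_historical(image: bytes) -> bool:
    """The Napalm Girl photo? A museum archive? Where's the line?"""
    return False  # placeholder

def is_artistic(image: bytes) -> bool:
    """A famous painting, sure. A student's life drawing? Fan art?"""
    return False  # placeholder

def moderate_with_exceptions(image: bytes) -> str:
    if not contains_nudity(image):
        return "published"
    if is_historical(image) or is_artistic(image):
        return "published"  # an exception applies
    return "removed"
```

And that’s one rule with two exceptions. Multiply it by every rule a real platform enforces, each with its own shifting definitions, and the scale of the problem becomes clear.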

So, when it comes to solutions to this problem, I have only one good idea in mind. It’s called discretion.

Discretion is vague. Discretion isn’t sexy. It isn’t transparent. It pisses people off. But when it comes to solving the problem of exceptions in rules, nothing else comes close. It is better to make good discretionary calls, and build a community that appreciates those calls, than to build a system of endless rules that will just get broken in the end anyway.

Anyway, that’s all for this one. If you want to keep in touch, check out my biweekly newsletter! It will give you the low-down on all the new stuff I’m working on, as well as some things I’ve found interesting. As an added bonus, you’ll also receive the Top 10 Tools I Use on a Daily Basis to help you better manage your workload and do higher-quality work in less time.

Subscribe here!
