Why content moderation costs billions and is so tricky for Facebook, Twitter, YouTube and others

Technology

After the riots at the Capitol on January 6, debate is swirling over how platforms moderate content and what is protected as free speech.

It’s a messy and expensive process, with Facebook spending billions to review millions of pieces of content every day. While TikTok directly employs content moderators, Facebook, Twitter and YouTube outsource most of the grueling work to thousands of workers at third-party companies.

Many moderators in the U.S. and overseas say they need higher pay, better working conditions and better mental health support because of the terrible things they see while sifting through hundreds or thousands of posts every day.

In response, some companies are relying more on algorithms they hope can take over most of the dirty work. But experts say machines can't detect everything, such as the nuances of hate speech and misinformation. There is also a host of alternative social networks, like Parler and Gab, that rose to popularity largely because they promised minimal content moderation. That approach led to Parler's temporary suspension from the Apple and Google app stores and its removal from Amazon Web Services hosting.

Other platforms, like Nextdoor and Reddit, rely almost exclusively on large numbers of volunteers for moderation.

Watch the video to find out just how big the business of content moderation has become and the real-world implications of the decisions social networks make about what content we can and cannot see.
