How Reddit avoids content moderation woes of Facebook, Twitter and YouTube

Content moderation is the issue that won’t go away for YouTube (GOOG), Facebook (FB) and Twitter (TWTR). As the social media juggernauts face ever-harsher scrutiny and pressure to police the billions of posts on their sites, smaller players are keeping a close watch on the regulatory action.

Hate speech, conspiracy theories, and other toxic content have proliferated on Reddit since the site was founded 14 years ago. Yet despite a massive reach of 330 million monthly active users, the company manages to stay out of the intense spotlight shining on its publicly traded competitors. While the tech giants have been called to testify in congressional hearings about hate speech and white nationalism, Reddit has largely escaped regulators’ attention.

“It’s a 14-year-old company that’s had years in there that were kind of wild. But since [founder and current CEO] Steve [Huffman] came back in 2015-2016, we’ve been on a very, very good path,” said COO Jen Wong in an interview with Yahoo Finance’s Breakouts last week.

Despite a lot of scrubbing, bad actors remain rampant on Reddit. Most recently, Philadelphia news anchor Karen Hepp sued Facebook and Reddit, among other platforms, over a security-camera photo of her taken at a convenience store that has since been used in dating and erectile dysfunction ads.

When asked how Reddit can ensure it doesn’t host such content, Wong echoed the tech industry’s go-to talking point: the community bears a certain level of responsibility.

Reddit Content Policy (Reddit)

“We take a different approach than any other platform, and we're pretty unique. Our approach is layered moderation, and this is something that I think is getting better all the time. At the base of it, Reddit, Inc. writes policies and rules — think of it as the federal government — and enforces those rules. We build tools to enforce those rules, and that's the layer that actually builds tools for our communities, as well,” she said.

“The second piece that's really unique to Reddit is community. So every community has human moderators. It has its own rules. Think of it as, like, states' rights that sit on top of the federal government. They provide moderation for the content, and what that does is it means that we have joint responsibility for the health and safety and the vibrancy of Reddit. And they share the burden, and they write their rules, and they police their own rules, and we give them tools to do that. So it's very unique how we approach it, but the concept is that we share the burden with the community. And what that allows for is for them to have nuance in what the rules are. We don't adjudicate all the rules. The communities in their contexts can set rules and adjudicate and apply, and we can too, and we do that together. And it's very different,” she said.