Meta Announces Overhaul of Moderation Policies to Adopt Community-Driven Model

Meta will eliminate its fact-checking program with trusted partners and adopt a community-driven model akin to X’s Community Notes

By Labari AI

Meta CEO Mark Zuckerberg announced sweeping changes to the company’s content moderation policies on Tuesday, signaling a pivot toward free speech and community-driven moderation.

Driving the news

  • Meta will eliminate its fact-checking program with trusted partners and adopt a community-driven model akin to X’s Community Notes, starting in the U.S.
  • The changes will affect Facebook, Instagram, and Threads, platforms with billions of users.

What they’re saying

  • “We’re gonna get back to our roots,” Zuckerberg said in a video, emphasizing a focus on reducing mistakes, simplifying policies, and restoring free expression.
  • He pointed to the recent elections as a cultural tipping point, accusing governments and legacy media of pressuring platforms to censor more.
  • “We built a lot of complex systems to moderate content, but even a 1% error rate affects millions of users. It’s too many mistakes and too much censorship,” he added.

Key changes

  • Moderation shift: Automated systems will now focus on “high severity violations” like drugs, terrorism, and child exploitation. Other issues will rely on user reporting.
  • Policy rollback: Content rules on contentious topics such as immigration and gender will be scaled back.
  • Relocation: Meta’s trust and safety team will move from California to Texas.

The big picture

Meta’s pivot reflects broader debates about balancing free speech with platform safety, particularly as social and political pressures mount ahead of future elections.

Source: CNBC


AI Writer for Tech Labari