Meta has ended its third-party fact-checking program in the U.S., introducing Community Notes on Facebook, Instagram, and Threads to empower users in content moderation.
Meta has officially wound down the program in favor of a community-driven approach, with Community Notes rolling out across Facebook, Instagram, and Threads. The shift hands much of the work of adding context to posts from professional fact-checkers to the platforms' own users.
The Community Notes feature lets users attach contextual information to posts, drawing on a wider range of contributor perspectives. The model closely mirrors the system used by X (formerly Twitter) and reflects Meta's stated aim of promoting free expression while still addressing misinformation.
Joel Kaplan, Meta's chief global affairs officer, said the rollout of Community Notes will be gradual, and that flagged content will carry no penalties at the outset. The phased launch is intended to let Meta refine the system based on user feedback and engagement.
The move aligns with CEO Mark Zuckerberg's stated goal of reducing content-moderation errors and fostering more open discourse. By decentralizing fact-checking, Meta says it hopes to reduce the biases of centralized moderation and give users an active role in maintaining platform integrity.
While Community Notes is currently limited to the U.S., Meta says it intends to expand the initiative internationally, adapting the model to different user communities and regional contexts.