Meta relaxes its rules and stops the fact-checking program

Meta, the parent company of Facebook, Instagram, and WhatsApp, has announced a strategic shift in its content moderation policies, aimed at promoting free expression and rolling back restrictions introduced in recent years in response to criticism over the spread of political and health misinformation.

In a statement titled "More Speech, Fewer Mistakes," Joel Kaplan, Meta's new President of Global Affairs, outlined key changes intended to redefine the role of digital platforms. These include ending the third-party fact-checking program and adopting a "community notes" model that relies on greater user participation. The company also said it would relax restrictions on trending topics, focusing moderation efforts only on illegal or harmful content, and would let users personalize the political content they see in order to foster a wider range of opinions and viewpoints.

This shift follows criticism of the company's role in spreading misinformation related to elections and the COVID-19 pandemic. Despite previous efforts to strengthen content oversight, such as creating an independent oversight board and imposing stricter rules, Meta's policies sparked widespread debate, with some viewing them as inadequate, while others criticized them as biased or flawed. Over the past year, the company has gradually begun to scale back its adherence to these rules, with former Meta policy chief Nick Clegg acknowledging that some measures may have been excessive.

These changes represent a new vision for Meta regarding free speech and content moderation, with significant anticipation surrounding their potential impact on the global digital and political landscape in the near future.