Zuck admits Facebook has been "censoring too much" and ends fact-checking practices
I'm not sure that when a butterfly flaps its wings in the Amazon, a storm ravages the other side of the globe; but the butterfly effect is real enough: we're experiencing it right now. Case in point: Trump won the November 5, 2024 election, and now, in the first week of January 2025, Zuckerberg is putting an end to the fact-checking program on Meta's social platforms.
Meta plans to phase out its third-party fact-checking program in the United States and replace it with a Community Notes system, inspired by the similar approach on X. Wait, wasn't X supposed to be a bad, toxic platform, and Elon Musk himself a malicious actor? Why would Meta do what X does?
The original fact-checking program, launched in 2016, aimed to provide users with additional context about online content through independent fact-checkers. However, Meta now acknowledges that biases and misjudgments in fact-checking led to unintended censorship of legitimate political speech and debate, undermining the program's goals. And that's putting it mildly.
Let's not forget another one of Zuckerberg's recent gems:
We’re getting rid of a number of restrictions on topics like immigration, gender identity and gender that are the subject of frequent political discourse and debate. It’s not right that things can be said on TV or the floor of Congress, but not on our platforms. [...] In recent years we’ve developed increasingly complex systems to manage content across our platforms [...]. This approach has gone too far. As well-intentioned as many of these efforts have been, they have expanded over time to the point where we are making too many mistakes, frustrating our users and too often getting in the way of the free expression we set out to enable. Too much harmless content gets censored, too many people find themselves wrongly locked up in “Facebook jail,” and we are often too slow to respond when they do.
– Meta Newsroom, January 7, 2025
So, the new Community Notes system is claimed to empower users from diverse perspectives to collaboratively identify potentially misleading posts and provide additional context. Meta will not create or decide which Notes appear; instead, Notes will be written and rated by users, with safeguards to ensure balanced input from varied viewpoints. Meta also plans to be transparent about how different perspectives contribute to the Notes shown on its platforms.
Initially, Community Notes will roll out in the US over the next few months, with plans to refine the system throughout the year. Users can already sign up on Facebook, Instagram, or Threads to become early contributors. As the transition progresses, Meta will end its current fact-checking controls, stop demoting flagged content, and replace intrusive warnings with subtle labels linking to additional context.
The goal of this shift is to provide users with better tools to evaluate content while minimizing bias and avoiding censorship, aligning more closely with Meta's original vision of promoting informed online engagement.
Personally, I'll remain skeptical until I see solid proof of these plans and intentions; but I can't deny that it's a step in the right direction. The fact-checkers did more harm than good, as even Zuck himself admits.
The only question I find myself asking is: would Zuckerberg have done the same thing had Trump not won the November election?