
Meta, the parent company of Facebook, Instagram, and Threads, is eliminating third-party fact-checkers in favor of a new Community Notes program modeled on X's approach. The change was detailed in an announcement by Meta's policy chief, Joel Kaplan. Meta is also relocating its trust and safety teams from California to Texas.
According to Meta, Community Notes will empower users to flag potentially misleading posts and let people with a diverse range of perspectives add helpful context. The company says this approach serves the original goal of giving users relevant information while being less prone to bias.
The rollout will begin in the US over the next couple of months. Instead of the full-screen warnings that users previously had to click through, posts will carry a less obtrusive label indicating that additional context is available. As on X, Meta says its Community Notes will require agreement among contributors with differing viewpoints before a note appears, to reduce the risk of biased ratings.
The moderation changes respond to criticism that Meta over-enforces its rules, removing benign content and acting slowly to restore mistakenly restricted accounts. Unlike Elon Musk, who relocated the headquarters of SpaceX and X to Texas, Meta is moving only the trust and safety teams responsible for content policies and reviews out of California.
Meta also plans to lift several existing restrictions on topics such as immigration and gender identity, while gradually reintroducing political content into user feeds across Facebook, Instagram, and Threads with a more personalized approach.
Meta will continue to use automated moderation systems, but they will now focus on severe violations such as terrorism, child exploitation, drugs, fraud, and scams. For less severe violations, Meta will rely on users to report content before it acts. The company is also discontinuing many of the automated systems that predict whether content might violate its policies.
Meta says these changes reflect the commitment to free expression that Mark Zuckerberg laid out in his 2019 Georgetown speech. The company stresses that it must continually assess how its policies affect users' ability to voice their opinions, and adjust its approach when those policies go too far.