This article is part of Poland Unpacked, weekly intelligence for decision-makers.
A year ago, Meta announced it would discontinue its cooperation with fact-checkers. Today, the company says it has no concrete plans to end its fact-checking program in Poland or other European markets.
“More Freedom of Speech, Fewer Mistakes” – that is the title of a piece published on Meta’s platforms on January 7, 2025. The article, accompanied by a video address of more than five minutes from the company’s founder, Mark Zuckerberg, laid out changes to how Meta and its social media platforms, such as Facebook and Instagram, approach content moderation.
“We have reached a point where we are making too many mistakes and applying moderation too aggressively. The recent U.S. presidential elections were also meant to serve as a cultural turning point, where freedom of speech takes center stage once again,” Mark Zuckerberg said.
Is fact-checking censorship?
Mark Zuckerberg’s statement drew an immediate response from the European Fact-Checking Standards Network, an umbrella organization for fact-checking outlets. In a published document, the network strongly criticized the Meta CEO’s position, which equated fact-checking with censorship.
At the time, fact-checking organizations called on European Union institutions to take action to curb the spread of false information and disinformation on the largest online platforms.
Is Meta changing its mind?
In April 2025, Meta officially ended its collaboration with fact-checkers in the United States. Fact verification conducted by independent partners operating according to transparent methodologies was replaced by “Community Notes,” similar to those on Elon Musk’s platform X. At the same time, Meta announced it would ease restrictions on discussions around “certain topics.” Plans included a more personalized approach to political content, including expanding its reach.
“We are beginning to roll out community notes in the United States. We will develop them over the next year before expanding their availability to other countries,” Meta stated at the time.
The shift in approach in the United States became a reality. In Europe, including Poland – as we have confirmed – it has not.
Meta is not abandoning its partnerships with Polish organizations
Currently, Meta collaborates with fact-checking organizations in over 100 countries. The list includes European states, among them Poland. It does not, however, cover the United States, where – in line with Mark Zuckerberg’s announcements – the program was wound down in April 2025.
In Poland, Meta’s partners include two organizations: Demagog and AFP Polska. Both have worked with Meta for years. We asked Meta representatives whether, following Zuckerberg’s announcement last year, the company had ended such collaborations in Europe and Poland.
Meta’s press office informed us that the company has not changed its approach to combating disinformation. Meta continues to work with independent fact-checking partners certified by the European Fact-Checking Standards Network (EFCSN) or the International Fact-Checking Network (IFCN).
Meta representatives confirmed that Demagog and AFP Polska continue to serve as independent fact verifiers. The company did not comment on the duration of its agreements with these organizations or whether it plans to extend them.
Demagog and AFP Polska confirm their collaboration with Meta
“Demagog Association continues to work with Meta under the Third-Party Fact-Checking Program. As part of this program, we verify content published on Facebook and Instagram,” said Marcel Kiełtyka, Director of Communications and PR at Demagog Association.
There are strong indications that Meta renewed its partnership with Demagog after Zuckerberg’s statement equating fact-checking on Meta platforms with censorship.
Marcel Kiełtyka declined to discuss the details of the collaboration, citing the confidentiality provisions (NDA) of the agreement with Meta.
We also contacted AFP Polska, which confirmed its participation in Meta’s external fact-checking program. Journalists from Agence France-Presse verify content published on Facebook, Instagram, and Threads in 26 languages worldwide, including Polish.
Changes take effect – but not in Europe
Meta’s press office confirmed that the changes announced in January 2025, concerning the rollout of Community Notes, currently apply only in the United States. No changes have been implemented in other markets.
Meta also did not provide any concrete plans or timelines for ending the fact-checking program in Poland or other European countries. According to the company, development work on the system continues in the U.S., and the mechanism itself is described as complex but effective.
The mass rollout of the Community Notes system in the United States began in November 2025. In Meta’s informational materials, it is repeatedly emphasized that the solution pertains exclusively to the U.S. market. The company also reserves the right to verify user location.
Expert: Fact-checking on Meta platforms is good news
Mikołaj Rogalewicz, a disinformation expert at CyberDefence24, believes that Meta’s continued collaboration with fact-checking organizations is a justified decision.
“The partnership program with fact-checking organizations, while not a perfect solution and often subject to debate, represents a tangible tool in the fight against disinformation. Importantly, Meta itself has highlighted the effectiveness of this model in previous years,” notes Mr. Rogalewicz.
The expert also acknowledges that the effectiveness of the Community Notes model may be limited, particularly in highly polarized topics or in areas requiring specialized knowledge.
Meta vs. the European Union
Why does Meta continue to work with independent fact-checkers despite earlier announcements that it would end the program?
“I believe one of the key reasons is EU regulation, including the Digital Services Act (DSA). In the European Union, very large online platforms are subject to obligations for managing systemic risks. This includes implementing adequate measures to mitigate these risks and submitting to independent audits,” emphasizes Mikołaj Rogalewicz.
In this context, it is worth recalling the signals sent by the European Commission. The EC stressed that any decision by Meta to withdraw from its collaboration with fact-checking organizations in the EU should be preceded by a risk assessment. The results of such an assessment would also need to be made public.
How fact-checking works on Meta platforms
Social media platforms such as Facebook and Instagram rely heavily on automated mechanisms to monitor posts and advertisements. For example, if a post contains certain keywords, the content may be automatically removed. Similar mechanisms are applied to advertisements.
However, the effectiveness of these measures is not absolute. In a January 2025 statement, Meta reported that in December 2024, millions of posts were removed daily – representing less than 1% of all content published on its platforms. According to the company’s calculations, around 20% of these removals may have been erroneous.
A separate category is the fact-checking program, launched in December 2016. This initiative was a response to criticisms that Facebook’s platforms (the company rebranded as Meta in 2021) were being used to spread propaganda and information manipulation.
“We believe in giving people a voice and that we cannot become arbiters of truth ourselves. That is why we approach this issue cautiously. We have focused our efforts on the most serious cases – obvious falsehoods spread by spammers for personal gain – as well as on engaging the community and external organizations,” Meta stated at the time.
Fact-checkers are tasked with assessing the accuracy of content appearing on the platforms. Outside the U.S., this activity remains active. Meta representatives explain that fact-checkers analyze content and evaluate its credibility. They focus on information that is most likely to be false, widely shared, and potentially socially consequential. Fact-checkers operate independently from Meta and use their own tools and methodologies.
After the analysis, content is assigned one of the following ratings: False, Altered, Partly False, Missing Context, Satire, or True. According to Meta, content rated as false receives appropriate labels. These labels provide users with additional context and may reduce the content’s reach on the platforms.
Good to know
Does Meta’s content-verification program make sense?
Meta’s information-verification program, conducted in collaboration with independent fact-checkers, has documented effectiveness in limiting the reach of disinformation and reducing belief in false content. Ahead of the 2024 European Parliament elections, Meta reported that between July and December 2023, more than 68 million pieces of content in the EU on Facebook and Instagram were labeled with fact-checking tags. After the labels were applied, 95% of users did not open the flagged content.
The DSA transparency report further indicates that errors in restricting the visibility of content flagged by fact-checkers accounted for around 3.15% of all visibility-reduction actions on Facebook, suggesting a relatively high degree of accuracy in the system. These findings are supported by academic research: a 2024 meta-analysis of 21 experiments found that fact-check warnings reduced belief in false information by an average of 27.6% and limited its sharing by 24.7%. The effect persisted even among users who expressed distrust of fact-checkers.
This demonstrates that content labeling and collaboration with independent fact-checkers are among the best-documented tools for curbing disinformation on social media platforms.
Key Takeaways
- Ultimately, contrary to earlier statements, Meta’s fact-checking program was discontinued only in the United States. In Poland, Meta continues to combat disinformation in partnership with Demagog and AFP Polska. Fact-checkers remain active in other EU countries as well. Experts note that this is largely the result of regulatory pressure from EU institutions, including obligations under the Digital Services Act (DSA).
- In January 2025, Meta announced it would end its collaboration with independent fact-checkers, with Mark Zuckerberg likening the model to censorship.
- In April 2025, Meta in the U.S. replaced fact-checking with Community Notes, a system whose effectiveness is questioned by experts. The company indicated that the U.S. model could eventually be rolled out to other markets, including Poland.
