A Warsaw judgment with global implications for Meta’s advertising model

A Polish court has sided with Rafał Brzoska and Omenaa Mensah in their case against Meta over fraudulent ads using their likenesses. The decision goes beyond interim relief, questioning the platform’s long-standing defence that it is merely a passive host of user content.

Rafał Brzoska, founder and CEO of InPost Group. Photo: Jakub Kuźmiński/XYZ
Rafał Brzoska has been in a legal battle with Meta since 2024. The Court of Appeal's ruling is also significant for other disputes with the platform. Photo: Jakub Kuźmiński/XYZ

The Court of Appeal in Warsaw has sided with Rafał Brzoska and Omenaa Mensah in their dispute with Meta Platforms, the owner of Facebook and Instagram. According to documents obtained by XYZ.pl, the court held Meta liable for advertisements appearing on its platforms, rejecting the argument that it operates as a mere “safe harbour”.

The case dates back at least to 2024. Across Meta-owned services – chiefly Facebook and Instagram – fraudulent investment ads began to proliferate. They misappropriated the likenesses of Rafał Brzoska and Omenaa Mensah.

The mechanism was repetitive and relied on manipulating users’ trust. The ads urged users to invest via various platforms. A key feature of these campaigns was the use of well-known public figures. In one version, a fabricated story claimed that Omenaa Mensah had been assaulted. In another, a deepfake featuring Rafał Brzoska purported to explain how he makes money from investments and encouraged viewers to transfer funds.

As Maciej Ślusarek, legal counsel to Rafał Brzoska and Omenaa Mensah, explains, the activity was both widespread and persistent.

“Attempts to remove the content at the platform level proved ineffective. Only legal action has brought the first breakthrough,” Mr. Ślusarek says.

Brzoska v. Meta: how the injunction came about

Following a lawsuit filed in 2024, a Warsaw court granted interim relief – an injunction barring the publication of the ads for the duration of the proceedings. The court of first instance sided with the claimants and prohibited the dissemination of content using their likenesses. Meta Platforms appealed, arguing that as a hosting provider it is not liable for content posted by users. The company invoked the Digital Services Act, maintaining that its role is limited to providing infrastructure. This argument became the central issue of the dispute, which was ultimately referred to the appellate court.

The decision of the Court of Appeal in Warsaw, issued on 27 March, proved a breakthrough. Counsel for the claimants, Maciej Ślusarek, requested the court’s written reasoning, which has now been delivered.

“The court partially upheld the injunction – in respect of Omenaa Mensah – while at the same time setting it aside in the part concerning Rafał Brzoska, finding it had been framed too broadly,” Mr. Ślusarek explains.

Court finds Meta liable

The crucial element, however, lies in the court’s reasoning, which addresses the essence of the platform’s liability.

“The court confirmed that Meta Platforms bears responsibility for the content of the advertisements – because it is an active participant and facilitates their dissemination,” says Maciej Ślusarek.

As he adds, Meta’s role goes beyond merely hosting ads: the platform provides tools for targeting, distribution and campaign optimization. The ruling challenges digital platforms’ long-standing defence that they are passive intermediaries. In the court’s view, Meta cannot rely on the liability exemption available to hosting providers; its role in the advertising process is, in fact, active.

“In that sense, the court has determined that Meta cannot invoke Articles 6 and 7 of the Digital Services Act, which exclude hosting providers from liability for advertising content,” the lawyer stresses.

In the United States, “safe harbor” refers to provisions shielding technology platforms from liability for user-generated content – most notably Section 230 of the Communications Decency Act of 1996. The concept is also used in Europe.

Good to know

Revolut report highlights the scale of fraud on social media

The scale of scams on social platforms has reached systemic proportions. The vast majority of so-called authorized fraud cases – where users themselves transfer money to criminals – originate on social media or messaging apps, according to a report by Revolut.

Platforms owned by Meta Platforms remain the largest source of such crimes, accounting for 44% of all scam cases. Facebook alone generates more than 21% of fraud incidents, while WhatsApp contributes a further 17%. At the same time, Telegram is gaining ground rapidly, already responsible for more than 20% of cases and, in many countries, becoming the primary channel for criminal activity.

A key factor is the scale of users’ exposure to fraudulent content. According to data cited in the report, European internet users saw nearly one trillion scam advertisements in 2025. The average user encounters around 190 such messages each month. This creates an environment in which scams become both pervasive and increasingly difficult to distinguish from legitimate offers.
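The two headline figures above are roughly self-consistent, as a back-of-envelope check shows. The European internet-user count used here (about 450 million) is an outside assumption for illustration, not a figure from the Revolut report:

```python
# Consistency check for the figures cited above.
# ASSUMPTION: ~450 million European internet users (not stated in the report).
users = 450_000_000
ads_per_user_per_month = 190  # figure cited in the report

# 450M users x 190 ads x 12 months ~= 1.03 trillion impressions per year,
# in line with the "nearly one trillion" figure for 2025.
ads_per_year = users * ads_per_user_per_month * 12
assert 0.9e12 < ads_per_year < 1.1e12
```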

The most common category remains so-called purchase scams – fraudulent sales offers – which account for roughly 57% of cases. However, job-related scams are growing even faster and now make up 22% of all incidents.

The conclusions are stark: social media are no longer merely communication channels but a critical infrastructure underpinning a global fraud market. Without greater accountability for content and advertising, the scale of the problem will continue to grow.

Court order strikes at Meta’s global strategy

XYZ’s editorial team has obtained the court’s written reasoning. It focuses primarily on the permissible scope of interim relief and the proportionality of the obligations imposed on the defendant. The court’s starting point was that the claimants – Omenaa Mensah and Rafał Brzoska – had plausibly demonstrated an infringement of personal rights through the dissemination of false content using their likeness.


“In the circumstances of this case, it is fairly obvious that the publication of the advertisements and other promoted materials identified in the claim resulted in an infringement of the personal rights of both claimants, including their image and reputation. The final catalogue of infringed interests will be determined by the court adjudicating the case in its final judgment. This issue is not in dispute at the appellate stage (the appellant has raised no objections in this regard),” the court wrote.

The court did not question the merits of granting interim protection. It stressed that materials suggesting a “purported death”, “assault” or “detention” constitute a serious intrusion into personal rights and may produce real social consequences.

“In the view of the Court of Appeal, the claimants have plausibly demonstrated that, despite taking steps to report illegal content, it was only partially removed or remained online. The content in question violates internal community standards and the terms of service of platforms administered by the defendant. The defendant has acknowledged that, using the tools and systems at its disposal – often automated – it is not able to identify all such content. As a consequence of the reports made and the failure to remove all content, it must be considered plausible that the defendant had actual knowledge of the existence of equivalent content,” the court stated.

Meta not always at fault

At the same time, the Court of Appeal stressed that interim relief cannot impose obligations on the defendant that are excessively broad or impracticable. In its reasoning, the court highlighted the need to strike a balance between protecting personal rights and ensuring the proportionality of measures that interfere with a company’s operations.

The key argument concerned the overly general and potentially imprecise scope of some of the obligations imposed. The court found that certain requirements could have led to a situation in which the defendant would be compelled to act beyond specific, identified instances of infringement – effectively forcing it to monitor and filter broad categories of content, including, for example, ordinary posts by businesses.

A technology company and liability for content

Such an obligation, the court held, should not be imposed at the interim stage, which is temporary in nature and cannot prejudge ultimate liability. Against this backdrop, the court partially revised the earlier ruling, removing elements of the injunction it deemed excessive. These changes relate to Rafał Brzoska.

The court emphasized that protective measures should be “closely linked to specific infringements” and capable of being implemented in a clear and unambiguous manner, without requiring the defendant to make broad, independent assessments of which content falls within the scope of the prohibition.

At the same time, the Court of Appeal upheld the part of the injunction relating to the forms of infringement described in the case concerning Omenaa Mensah. It found that, in this respect, the obligation to remove or block content is justified and proportionate, as it pertains to clearly defined communications infringing personal rights.

Court leans on the DSA

The court’s reasoning on Meta’s liability for advertising is particularly noteworthy.

“In the view of the Court of Appeal, the claimants are right as to the nature of the defendant’s role in the publication of advertisements and sponsored posts. First, it is the defendant’s decision whether such content is placed on the platform. The defendant receives remuneration and offers advertisers support and tools (algorithms) that increase a campaign’s reach, target the appropriate audience and enhance the expected returns for those paying for the service. In this context, the defendant’s role should not be regarded as passive or reduced to purely automated activity. The adopted business model cannot be without consequence for assessing the defendant’s status as an intermediary service provider,” the court wrote in its reasoning.

It is the defendant that determines whether a given advertisement appears within the digital environment it administers, verifying compliance with its own rules.

“This means that the defendant has access to the content of the message even before publication – at the stage of accepting the order and collecting the relevant fee. The scope of the activities undertaken and the tools used in this regard were set out in the response to the appeal, in a manner which, in light of the requirement to establish a prima facie case, may be regarded as sufficient. It is therefore reasonable to conclude that the defendant’s role in the advertising process is conscious, active and directed at generating financial gain – both for itself and for advertisers,” the Court of Appeal set out.

Taken together, the reasoning signals a narrower reading of the liability shield under the Digital Services Act, with direct implications for Meta Platforms’ operating model in Europe.

Who is responsible for content on Facebook

The court’s reasoning also draws a clear distinction between liability for specific content and a general obligation to prevent such content from appearing in the future. It stresses that interim proceedings are not the appropriate instrument for imposing broad, systemic preventive duties – especially where these would require complex technical and organizational measures.

“At this stage, the defendant’s argument – advanced by a leading global provider of intermediary services – that it lacks the technological capability raises doubts, particularly given that the defendant itself, as it has acknowledged, uses artificial intelligence to improve the operation of the platforms it administers,” the court wrote.


The overall line of reasoning leads to the conclusion that the claimants should be afforded protection, but in a precise manner and limited to what is strictly necessary. The court underscores the temporary nature of interim relief and the need to refrain from decisions that could, in practice, prejudge the outcome of the main proceedings or impose an excessive burden on one party before the case is resolved.

The judgment thus refines the boundaries of platform liability for content on services such as Facebook, with broader implications for Meta Platforms’ compliance approach in Europe.

Consequences of the ruling

Although formally this is an interim injunction, its significance goes beyond a temporary procedural stage. The court addressed a fundamental question of liability for advertising content. According to counsel, this is the first ruling of its kind in cases involving technology platforms in Europe.

“This is a landmark decision across Europe,” says Maciej Ślusarek.

At the same time, the financial dimension of the case is becoming increasingly important. The legal representative of Rafał Brzoska and Omenaa Mensah has announced plans to seek disclosure from Meta Platforms of revenues generated from advertisements using the claimants’ likenesses. These figures will be used to determine the level of damages sought. The potential sums involved may be substantial.

The lawyer refers to materials reported by Reuters, which are said to indicate the global scale of the phenomenon.

Billions in revenue for Meta

“Approximately 10–12% of Meta’s global revenue. We are talking about billions of dollars,” says the lawyer.
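The order of magnitude behind that quote can be illustrated with a rough calculation. The revenue figure below comes from Meta's public filings (roughly $134.9 billion in full-year 2023), not from this article, and is used purely as an assumption:

```python
# Rough scale of the revenue share the lawyer cites.
# ASSUMPTION: Meta's 2023 full-year revenue of ~$134.9B (public filings,
# not a figure from this article).
meta_revenue_usd = 134.9e9

# 10-12% of ~$135B works out to roughly $13.5B-$16.2B per year,
# consistent with the "billions of dollars" characterization.
low = 0.10 * meta_revenue_usd
high = 0.12 * meta_revenue_usd
assert 13e9 < low < high < 17e9
```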

If these figures are confirmed during the proceedings, they could become a key argument in the case, pointing to the systemic nature of the issue.

Another important aspect of the dispute is the effectiveness of measures taken by the platform. Meta Platforms has stated that it introduced mechanisms designed to limit the misuse of personal likenesses in advertising. However, according to the claimants, these measures have not produced tangible results.

“Despite the introduction of such a system, scam advertisements featuring Mr. Rafał continued to appear and still appear,” stresses Maciej Ślusarek.

In his view, the lack of effective action may stem from economic incentives.

“Meta is most likely not interested in stopping this type of advertising, because it generates enormous revenues,” the lawyer adds.

The case in Ireland

The dispute also has an international dimension. Parallel proceedings are underway before the Irish Data Protection Commission – the authority responsible for Meta’s European headquarters. It is there that a decision may be taken on potential administrative sanctions for breaches of data protection rules, including unlawful use of personal likenesses. Under the GDPR, such fines could reach up to 4% of the company’s global annual turnover.

With the written reasoning for the interim order now delivered, the case is entering its next phase. Legal representatives have announced they will file a new, more narrowly tailored application for interim relief concerning the likeness of Rafał Brzoska and continue the main proceedings.

“We will of course file a new application, more precisely defining the deepfake content, in line with what the Court of Appeal indicated,” says Maciej Ślusarek.

At the next stage, the court will decide on possible apologies, damages, and the final scope of the platform’s liability.

Rafał Brzoska’s reaction

Rafał Brzoska welcomed the ruling.

“This is not the end of this fight. This is the moment when the model of impunity for big platforms begins to crack!” he wrote in a post on LinkedIn.

We asked Meta Platforms’s Polish press office for comment. We had not received a response by the time of publication.

Key Takeaways

  1. The dispute between Rafał Brzoska and Omenaa Mensah and Meta exposes the scale and operating mechanisms of investment scams on social media – particularly those based on the manipulation of well-known individuals’ likenesses. The court confirmed that the advertisements in question infringed the claimants’ personal rights. Their mass distribution, combined with the ineffectiveness of removal requests submitted to the platform, justified legal intervention. The case illustrates that existing content moderation tools are insufficient in the face of the scale and repetitive nature of such abuse.
  2. A key element of the ruling is the recognition of Meta’s active role in the advertising delivery process, which undermines its defense based on the status of a passive intermediary. The court noted that the platform does not merely provide infrastructure but also participates in targeting and optimization of advertising campaigns, while deriving financial benefit from these activities. As a result, Meta cannot rely on the liability exemptions provided under the Digital Services Act. This may have broader implications for the entire technology sector in Europe.
  3. At the same time, the court emphasized the need for proportionality in legal remedies, setting aside the interim relief concerning Rafał Brzoska as framed too broadly while upholding it in respect of Omenaa Mensah. The ruling does not impose a general monitoring obligation on Meta but requires responses to specific infringements. The case is potentially precedent-setting and may influence future regulation and platform liability for advertising content – particularly in the context of the growing scale of fraud and the potentially multi-billion-dollar revenues generated in this segment.


We covered this story because we considered it important and newsworthy. For full transparency, we note that the RiO fund, owned by Omenaa Mensah and Rafał Brzoska – CEO and shareholder of InPost – is an investor in XYZ.