Martinelli Updates

STF Redefines the Liability of Digital Platforms: Impacts and New Obligations 

On June 26, 2025, the Brazilian Federal Supreme Court (STF) declared Article 19 of the Brazilian Internet Civil Framework (Law No. 12,965/2014) partially unconstitutional, establishing new parameters for the civil liability of digital platforms for third-party content. By majority vote, the Court ruled that requiring a judicial order as a prior and absolute condition for platform liability is incompatible with the Constitution, especially considering the need to ensure the effectiveness of fundamental rights in the digital environment. 

What was the rule before? 

Article 19 provided that platforms, service providers, and websites could only be held civilly liable for damages resulting from third-party content if, after a specific court decision, they failed to remove the unlawful material. 

Article 21 established exceptions, allowing civil liability based on extrajudicial notifications in cases involving non-consensual pornography or copyright infringement. 

What has changed? 

The STF’s ruling significantly altered this legal framework. The Court upheld the partial constitutionality of Article 19 but restricted its application to certain services that involve private communications, such as email providers, private messaging apps, and closed voice or video conferencing platforms. 

In these specific cases, a prior judicial order remains required to mandate content removal and establish civil liability, thereby safeguarding user privacy. 

Outside these scenarios, i.e., on open or public platforms such as social networks and content-sharing services, civil liability may now arise without a judicial order, provided the platform is proven to have been at fault through action or omission, under the subjective liability regime. 

This applies especially when a platform, after being duly notified through extrajudicial means, fails to remove clearly unlawful content or to take reasonable and proportionate measures to contain its dissemination. 

Presumption of Liability 

The STF also recognized specific cases where liability may be presumed, marking a partial shift from the purely subjective liability regime originally provided under Article 19. 

Platform liability is presumed, and the burden of proof shifts to the provider to demonstrate diligent action, in the following scenarios: 

  • Content promoted through paid boosting or sponsorship; 
  • Content disseminated through artificial means, such as bots or automated systems. 

 

In these circumstances, platforms are expected to meet heightened standards of diligence, as liability is presumed and does not depend on prior notification. Providers must demonstrate they acted promptly and effectively to remove such content. 

Liability for Systemic Failures and Serious Crimes 

The ruling further addresses platform liability in cases involving serious crimes, including: 

  • Child pornography; 
  • Incitement to suicide or self-harm; 
  • Violence against women; 
  • Anti-democratic acts; 
  • Terrorism; 
  • Hate crimes. 

  

In such cases, civil liability may arise where a systemic failure is demonstrated, meaning the absence of effective mechanisms to prevent, detect, or respond to evidently illegal content. 

Although the prevailing regime remains one of subjective liability, requiring proof of fault, the STF's decision constitutes a significant shift, expanding the situations in which fault may be presumed, particularly where platforms repeatedly fail to act or operate systems that facilitate the spread of illegal content. 

Automatic Removal of Reposted Content 

The ruling also determined that content previously found to be illegal by a final court decision must be promptly removed by any platform if reposted, regardless of a new court order. In such cases, an extrajudicial notification identifying the reposting is sufficient. 

Duty of Care and Systemic Failures 

The decision introduces, at least partially, a general duty of care regarding systemic failures. A systemic failure attributable to an internet application provider is characterized by the omission of adequate measures to prevent or remove the following types of illegal content, and constitutes a breach of the duty to act with responsibility, transparency, and diligence: 

  • Human trafficking; 
  • Sexual crimes involving vulnerable individuals, including child pornography and severe violence against minors; 
  • Crimes against women; 
  • Incitement or assistance in suicide or self-harm; 
  • Terrorist acts or related preparatory activities; 
  • Anti-democratic acts or behaviors; 
  • Incitement to discrimination based on race, color, ethnicity, religion, nationality, sexual orientation, or gender identity. 

  

Platforms may be held liable for failing to promptly make such content unavailable. 
However, the isolated presence of illegal content is not, by itself, sufficient to establish liability under this section; in such cases, Article 21 of the Internet Civil Framework remains applicable. 

Requirement for Legal Representation in Brazil 

Another critical point for application providers is the obligation to appoint a legal representative and maintain an office in Brazil, and to ensure that the representative's contact information is easily accessible on the platform's website. 

According to the STF’s position, this representative must: 

  1. Be a legal entity headquartered in Brazil; 
  2. Be clearly identified and reachable through the platform’s official site; 
  3. Hold full powers to: 
  • Represent the company in administrative and judicial proceedings; 
  • Provide authorities with detailed information on content moderation practices, internal policies, transparency reports, algorithmic procedures, and content boosting strategies; 
  • Comply with court orders; 
  • Respond to and fulfill penalties, including fines and financial sanctions, resulting from breaches of legal or judicial obligations. 

This requirement is intended to ensure legal accountability and public oversight of platforms operating within the Brazilian digital ecosystem. 

 

Impact on Marketplaces 

For platforms operating as marketplaces, the new framework presents significant compliance challenges, particularly given the increased risk of liability for third-party content, especially in cases of systemic failure or lack of diligence. 

In user-driven environments where goods or services are offered independently by users, marketplaces must now implement: 

  • Stricter content and ad moderation controls; 
  • Preventive measures to screen unlawful offers; 
  • Effective channels for reporting and responding to violations. 

  

In addition to obligations under the Brazilian Consumer Protection Code, marketplaces face heightened responsibilities and must implement more robust monitoring systems. 

Transparency and Self-Regulation Duties 

The STF emphasized the duty of transparency and the importance of self-regulation by digital platforms. Examples of required measures include: 

  • User-friendly mechanisms for content reporting and appeals, even for unregistered users; 
  • Periodic transparency reports disclosing moderation practices, takedowns, and content enforcement metrics; 
  • Permanent contact channels for users and public authorities. 

  

The Court encouraged the Brazilian Congress to further regulate these obligations, and new legislation is expected to follow. 

Temporal Effects 

The effects of the decision were expressly modulated to apply only prospectively, from the date of the judgment (June 26, 2025) onward. Cases already resolved by final court decisions will remain unaffected. 

Filipe Ribeiro

Larissa Anghinetti

Vanessa Lima Nascimento
