
Disclaimer: This is an unofficial English translation provided for reference only. It does not replace the need to read and analyze the original documents. Please consult the official Portuguese versions for all legal and official purposes.
—————————————
ECA DIGITAL (https://www.planalto.gov.br/ccivil_03/_ato2023-2026/2025/Lei/L15211.htm)
Presidency of the Republic
Chief of Staff's Office
Special Secretariat for Legal Affairs
LAW NO. 15,211, OF SEPTEMBER 17, 2025
Provides for the protection of children and adolescents in digital environments (Digital Statute of the Child and Adolescent).
THE PRESIDENT OF THE REPUBLIC
I make it known that the National Congress decrees and I sanction the following Law:
CHAPTER I
PRELIMINARY PROVISIONS
Art. 1. This Law provides for the protection of children and adolescents in digital environments and applies to any information technology product or service directed at children and adolescents in the country or likely to be accessed by them, regardless of its location, development, manufacturing, offering, commercialization, and operation.
Sole Paragraph. For the purposes of this Law, the following situations are considered likely to be accessed by children and adolescents:
I – a sufficient probability of use and attractiveness of the information technology product or service by children and adolescents;
II – considerable ease of access to and use of the information technology product or service by children and adolescents; and
III – a significant degree of risk to the privacy, security, or biopsychosocial development of children and adolescents, especially in the case of products or services intended to allow social interaction and the large-scale sharing of information among users in a digital environment.
Art. 2. For the purposes of this Law, the following definitions apply:
I – information technology product or service: a product or service provided at a distance, by electronic means, and delivered upon individual request, such as internet applications, computer programs, software, terminal operating systems, internet application stores, and electronic games or similar products connected to the internet or another communications network;
II – child monitoring product or service: an information technology product or service intended for the monitoring, by parents or legal guardians, of actions performed by children and adolescents in digital environments, through the recording or transmission of images, sounds, location information, activity, or other data;
III – social network: an internet application whose main purpose is the sharing and dissemination, by users, of opinions and information conveyed through texts or image, sound, or audiovisual files, on a single platform, through connected or articulately accessible accounts, allowing connection between users;
IV – reward box (loot box): a feature available in certain electronic games that allows the player to acquire, upon payment, consumable virtual items or random advantages, redeemable by the player or user, without prior knowledge of their content or guarantee of their effective utility;
V – profiling: any form of automated or non-automated processing of personal data to evaluate certain aspects of a natural person, with the objective of classifying them into a group or profile in order to make inferences about their behavior, economic situation, health, personal preferences, interests, consumer desires, geographical location, movements, political positions, or other similar characteristics;
VI – internet application store: an internet application that distributes and facilitates the download, for terminal users, of internet applications made available or accessible through its platform;
VII – operating system: system software that controls the basic functions of hardware or software and allows internet applications, computer programs, applications, or other software to be executed through it;
VIII – parental supervision mechanism: a set of settings, tools, and technological safeguards integrated into information technology products or services directed at children and adolescents or likely to be accessed by them, which enable parents or legal guardians to supervise, limit, and manage the use of the service, the accessed content, and the processing of personal data performed;
IX – service with editorial control: an internet application whose main purpose is the provision of content previously selected by a responsible economic agent, without the use of automated selection means;
X – autonomous administrative authority for the protection of the rights of children and adolescents in the digital environment: a public administration entity created by law, responsible for ensuring the application of this Law and overseeing its compliance throughout the national territory, and for issuing regulations and procedures for its execution, which must observe in its decision-making process the norms provided for in Chapter I of Law No. 13,848, of June 25, 2019;
XI – monetization: direct or indirect remuneration of an internet application user for the publication, posting, display, provision, transmission, dissemination, or distribution of content, including revenue from views, subscriptions, donations, sponsorships, advertising, or the sale of linked products and services; and
XII – boosting: the artificial expansion of the reach, visibility, or prioritization of content through pecuniary payment or a value estimable in money.
§1. The concepts of child and adolescent contained in Art. 2 of Law No. 8,069, of July 13, 1990 (Statute of the Child and Adolescent), and those of internet, internet applications, and terminal contained in Art. 5 of Law No. 12,965, of April 23, 2014 (Civil Framework for the Internet), apply to this Law.
§2. For the purposes of this Law, functionalities essential for the operation of the internet, such as protocols and open and common technical standards that allow for the interconnection between the computer networks that make up the internet, are not considered information technology products or services.
Art. 3. Information technology products or services directed at children and adolescents or likely to be accessed by them must ensure the priority protection of these users, have their best interest as a parameter, and feature adequate and proportional measures to ensure a high level of privacy, data protection, and security, under the terms defined in Laws No. 8,069, of July 13, 1990 (Statute of the Child and Adolescent), and No. 13,709, of August 14, 2018 (General Personal Data Protection Law).
Sole Paragraph. The child and the adolescent have the right to be educated, guided, and accompanied by their parents or legal guardians regarding the use of the internet and their digital experience, and it is incumbent upon the latter to exercise active and continuous care, through the use of parental supervision tools appropriate to the age and stage of development of the child and adolescent.
CHAPTER II
ON INFORMATION TECHNOLOGY PRODUCTS AND SERVICES
Art. 4. The use of information technology products or services by children and adolescents is based on the following fundamentals:
I – the guarantee of their full protection;
II – the absolute prevalence of their interests;
III – their peculiar condition as a person in biopsychosocial development;
IV – security against intimidation, exploitation, abuse, threats, and other forms of violence;
V – respect for the autonomy and progressive development of the individual;
VI – protection against commercial exploitation;
VII – observance of the principles established in Law No. 13,146, of July 6, 2015 (Statute of Persons with Disabilities);
VIII – the promotion of digital education, focusing on the development of citizenship and critical thinking for the safe and responsible use of technology; and
IX – transparency and accountability in the processing of personal data of children and adolescents.
Art. 5. Information technology products or services directed at children and adolescents or likely to be accessed by them shall observe the duties of prevention, protection, information, and security provided for in this Chapter and in Laws No. 8,078, of September 11, 1990 (Consumer Defense Code), and No. 8,069, of July 13, 1990 (Statute of the Child and Adolescent), in accordance with the principle of the best interest of the child and adolescent and their full, special, and priority protection.
§1. The providers of the information technology products or services referred to in the caput of this article shall adopt adequate technical measures, including widely recognized security mechanisms, that enable the family and legal guardians to prevent improper access and use by children and adolescents.
§2. For the purposes of this Law, the protection of their privacy, security, mental and physical health, access to information, freedom of participation in society, meaningful access to digital technologies, and well-being are considered as an expression of the best interest of the child and adolescent.
§3. The autonomous administrative authority for the protection of the rights of children and adolescents in the digital environment may issue recommendations and guidelines concerning relevant practices for the fulfillment of the obligations provided for in this Law, considering regulatory asymmetries, the functionalities and risk level of each product or service, as well as technological evolution and applicable technical standards.
Art. 6. Providers of information technology products or services directed at children and adolescents or likely to be accessed by them shall take reasonable measures from the design stage and throughout the operation of their applications, with the objective of preventing and mitigating risks of access, exposure, recommendation, or facilitation of contact with the following content, products, or practices:
I – sexual exploitation and abuse;
II – physical violence, systematic virtual intimidation (cyberbullying), and harassment;
III – induction, incitement, instigation, or aid, through instructions or guidance, to practices or behaviors that lead to harm to the physical or mental health of children and adolescents, such as physical violence or psychological harassment of other children and adolescents, use of substances that cause chemical or psychological dependence, self-diagnosis and self-medication, self-harm, and suicide;
IV – promotion and commercialization of games of chance, fixed-odds betting, lotteries, tobacco products, alcoholic beverages, narcotics, or products whose sale is prohibited to children and adolescents;
V – predatory, unfair, or misleading advertising practices or other practices known to cause financial harm to children and adolescents; and
VI – pornographic content.
§1. The provisions of this article do not exempt parents and legal guardians, persons who financially benefit from the production or public distribution of any visual representation of a child or adolescent, and administrative, judicial, and police authorities from acting to prevent their exposure to the violating situations provided for in the caput of this article.
§2. Among the prevention measures provided for in the caput of this article are included clear, effective, and adequate policies under Brazilian law for the prevention of systematic virtual intimidation and other forms of harassment on the internet, with adequate support mechanisms for victims, as well as the development and provision of educational awareness programs directed at children, adolescents, parents, educators, employees, and support teams about the risks and ways to prevent and confront these practices, under the terms of the regulation.
Art. 7. Providers of information technology products or services directed at children and adolescents or likely to be accessed by them shall, from the design of their products and services, ensure, by default, the configuration in the most protective model available regarding privacy and personal data protection, considering the autonomy and progressive development of the individual and justified by the best interest of the child and adolescent.
§1. The product or service referred to in the caput of this article shall, by default, operate with the highest degree of privacy and personal data protection, it being mandatory to provide clear, accessible, and adequate information so that the child or adolescent and their guardians can make informed choices regarding the eventual adoption of less protective settings.
§2. The providers mentioned in the caput of this article shall refrain from processing the personal data of children and adolescents in a way that causes, facilitates, or contributes to the violation of their privacy or any other rights assured to them by law, observing the principles provided for in Art. 6 of Law No. 13,709, of August 14, 2018 (General Personal Data Protection Law), and the best interest of the child and adolescent.
Art. 8. Providers of information technology products or services directed at children and adolescents or likely to be accessed by them shall:
I – perform risk management of their resources, functionalities, and systems and of their impacts directed at the safety and health of children and adolescents;
II – conduct an assessment of the content made available to children and adolescents according to the age group, so that it is compatible with the respective age rating;
III – offer systems and processes designed to prevent children and adolescents from encountering, through the product or service, illegal and pornographic content, as well as other content manifestly inappropriate for their age group, in accordance with age rating norms and applicable legislation;
IV – develop from the design stage and adopt by default settings that prevent the compulsive use of products or services by children and adolescents; and
V – extensively inform all users about the recommended age group for the product or service at the moment of access, as established by the age rating policy.
CHAPTER III
ON THE PROHIBITION OF ACCESS BY CHILDREN AND ADOLESCENTS TO IMPROPER, INADEQUATE, OR LEGALLY PROHIBITED CONTENT AND SERVICES
Art. 9. Providers of information technology products or services that make available content, a product, or a service whose offer or access is improper, inadequate, or prohibited for persons under 18 (eighteen) years of age shall adopt effective measures to prevent its access by children and adolescents within the scope of their services and products.
§1. To give effect to the provisions of the caput, reliable age verification mechanisms shall be adopted for each user access to the content, product, or service referred to in the caput of this article, with self-declaration being prohibited.
§2. For the purposes of this Law, information technology products, services, or content that contain pornographic material, or any other material prohibited by current legislation, are considered improper or inadequate for children and adolescents.
§3. Internet application providers that make pornographic content available shall prevent the creation of accounts or profiles by children and adolescents within their services.
CHAPTER IV
ON AGE ASSESSMENT MECHANISMS
Art. 10. Providers of information technology products or services directed at children and adolescents or likely to be accessed by them shall adopt mechanisms to provide age-appropriate experiences, under the terms of this Chapter, respecting the progressive autonomy and the diversity of Brazilian socioeconomic contexts.
Art. 11. The public authorities may act as regulator, certifier, or promoter of technical age verification solutions, observing the limits of legality, privacy protection, and fundamental rights provided by law.
Sole Paragraph. The action of the public authorities provided for in the caput of this article shall ensure social participation, through public consultation and other mechanisms of social participation, in order to guarantee transparency in the regulatory process.
Art. 12. Providers of internet application stores and terminal operating systems shall:
I – take proportional, auditable, and technically secure measures to assess the age or age range of users, observing the principles provided for in Art. 6 of Law No. 13,709, of August 14, 2018 (General Personal Data Protection Law);
II – allow parents or legal guardians to configure voluntary parental supervision mechanisms and to actively supervise the access of children and adolescents to applications and content; and
III – enable, through a secure Application Programming Interface (API) guided by privacy by default, the provision of an age signal to internet application providers, exclusively for the purpose of complying with this Law and with adequate technical safeguards.
§1. The provision of an age signal through APIs shall observe the principle of data minimization, with any continuous, automated, and unrestricted sharing of personal data of children and adolescents being prohibited.
§2. Authorization for the download of applications by children and adolescents shall depend on the free and informed consent of parents or legal guardians, provided under the terms of current legislation, respecting progressive autonomy, with the presumption of authorization being prohibited in the event of a lack of manifestation from the parents or legal guardians.
§3. An act of the Executive Branch shall regulate the minimum requirements for transparency, security, and interoperability for the age assessment and parental supervision mechanisms adopted by operating systems and application stores.
Art. 13. Data collected for the age verification of children and adolescents may be used solely for that purpose, with its processing for any other purpose being prohibited.
Art. 14. Providers of information technology products or services directed at children and adolescents or likely to be accessed by them shall adopt technical and organizational measures to ensure the receipt of the age information referred to in Art. 12 of this Law.
Sole Paragraph. Regardless of the measures adopted by operating systems and application stores, the providers mentioned in the caput of this article shall implement their own mechanisms to prevent undue access by children and adolescents to content inappropriate for their age group, under the terms of § 1 of Art. 5 of this Law.
Art. 15. Compliance with the obligations set forth in this Chapter does not exempt the other agents in the digital chain from their legal responsibilities, it being incumbent upon all involved to jointly ensure the full protection of children and adolescents.
CHAPTER V
ON PARENTAL SUPERVISION
Art. 16. Providers of information technology products or services directed at children and adolescents or likely to be accessed by them shall make available to parents, legal guardians, children, and adolescents, with access independent of the purchase of the product, information about the risks and the security measures adopted for this public, including privacy and data protection, in accordance with the provisions of Art. 14 of Law No. 13,709, of August 14, 2018 (General Personal Data Protection Law).
Sole Paragraph. In the event of processing data of children and adolescents, especially when carried out for purposes other than those strictly necessary for the operation of the product or service, the controller referred to in item VI of Art. 5 of Law No. 13,709, of August 14, 2018 (General Personal Data Protection Law), shall:
I – map the risks and make efforts to mitigate them; and
II – prepare an impact, monitoring, and evaluation report on personal data protection, to be shared upon request from the autonomous administrative authority for the protection of the rights of children and adolescents in the digital environment, in the form of a regulation.
Art. 17. Providers of information technology products or services directed at children and adolescents or likely to be accessed by them shall:
I – provide accessible and easy-to-use settings and tools that support parental supervision, considering the available technology and the nature and purpose of the product or service;
II – provide, in an easily accessible location, information to parents or legal guardians about the existing tools for exercising parental supervision;
III – display a clear and visible notice when parental supervision tools are in effect and about which settings or controls have been applied; and
IV – offer functionalities that allow for limiting and monitoring the time of use of the product or service.
§1. The autonomous administrative authority for the protection of the rights of children and adolescents in the digital environment shall establish, by regulation, minimum guidelines and standards on parental supervision mechanisms to be observed by providers.
§2. The development and use of parental supervision mechanisms shall be guided by the best interest of the child and adolescent, considering the progressive development of their capacities.
§3. Providers of information technology products or services directed at children and adolescents or likely to be accessed by them may submit parental supervision mechanisms for the consideration of the autonomous administrative authority for the protection of the rights of children and adolescents in the digital environment, observing that this shall not be a prerequisite for the use of these mechanisms or for making products or services available to the public, under the terms of the regulation.
§4. The default settings of parental supervision tools shall adopt the highest level of protection available, ensuring, at a minimum:
I – restriction of communication with children and adolescents by unauthorized users;
II – limitation of features designed to artificially increase, sustain, or extend the use of the product or service by the child or adolescent, such as automatic media playback, rewards for time of use, notifications, and other features that may result in excessive use of the product or service by a child or adolescent;
III – provision of tools for monitoring the proper and healthy use of the product or service;
IV – use of interfaces that allow for the immediate visualization and limitation of the time of use of the product or service;
V – control over personalized recommendation systems, including an option to disable them;
VI – restriction on the sharing of geolocation and provision of a prior and clear notice about its tracking;
VII – promotion of digital media literacy regarding the safe use of information technology products or services;
VIII – regular review of artificial intelligence tools, with the participation of specialists and competent bodies, based on technical criteria that ensure their safety and suitability for use by children and adolescents, guaranteeing the possibility of disabling functionalities that are not essential to the basic operation of the systems; and
IX – provision, whenever technically feasible, of resources or connections to emotional support and well-being services, with age-appropriate content and evidence-based guidance, especially in cases of interactions with identified psychosocial risks.
Art. 18. Parental supervision tools shall allow parents and legal guardians to:
I – view, configure, and manage the account and privacy options of the child or adolescent;
II – restrict purchases and financial transactions;
III – identify the profiles of adults with whom the child or adolescent communicates;
IV – access consolidated metrics of the total time of use of the product or service;
V – activate or deactivate safeguards through accessible and adequate controls; and
VI – have access to information and control options in the Portuguese language.
§1. Information about parental supervision tools shall be made available in a clear and appropriate manner for different ages, capacities, and developmental needs, without encouraging the deactivation or weakening of the safeguards.
§2. It is prohibited for the provider to design, modify, or manipulate interfaces with the objective or effect of compromising the autonomy, decision-making, or choice of the user, especially if it results in the weakening of parental supervision tools or safeguards.
CHAPTER VI
ON CHILD MONITORING PRODUCTS
Art. 19. Child monitoring products or services shall contain current information and communication technology mechanisms and solutions to ensure the inviolability of the images, sounds, and other information captured, stored, and transmitted to parents or legal guardians.
§1. The products or services shall contain mechanisms that inform children and adolescents, in appropriate language, about the performance of monitoring.
§2. The development and use of child monitoring mechanisms shall be guided by the best interest of the child and adolescent and by the full development of their capacities.
CHAPTER VII
ON ELECTRONIC GAMES
Art. 20. Reward boxes (loot boxes) offered in electronic games directed at children and adolescents or likely to be accessed by them are prohibited, under the terms of the respective age rating.
Art. 21. Electronic games directed at children and adolescents or likely to be accessed by them that include interaction functionalities between users through text, audio, or video messages or content exchange, synchronously or asynchronously, shall fully observe the safeguards provided for in Art. 16 of Law No. 14,852, of May 3, 2024, especially with regard to content moderation, protection against harmful contacts, and parental action on communication mechanisms.
Sole Paragraph. The games referred to in the caput of this article shall, by default, limit users' interaction functionalities so as to ensure the consent of parents or legal guardians.
CHAPTER VIII
ON ADVERTISING IN DIGITAL MEDIA
Art. 22. In addition to the other provisions of this Law, the use of profiling techniques for directing commercial advertising to children and adolescents is prohibited, as well as the use of emotional analysis, augmented reality, extended reality, and virtual reality for this purpose.
Art. 23. Internet application providers are prohibited from monetizing and boosting content that portrays children and adolescents in an eroticized or sexually suggestive manner or in a context proper to the adult sexual universe.
CHAPTER IX
ON SOCIAL NETWORKS
Art. 24. Within the scope of their services, providers of products or services directed at children and adolescents or likely to be accessed by them shall ensure that users or accounts of children and adolescents up to 16 (sixteen) years of age are linked to the user or account of one of their legal guardians.
§1. If their services are improper or inadequate for children and adolescents, social network providers shall adopt adequate and proportional measures to:
I – inform all users in a clear, prominent, and accessible manner that their services are not appropriate for children and adolescents;
II – monitor and restrict, within the limits of their technical capabilities, the display of content that has the evident objective of attracting children and adolescents; and
III – continuously improve their age verification mechanisms to identify accounts operated by children and adolescents.
§2. The degree of effectiveness and the progress of the mechanisms referred to in item III of § 1 of this article shall be evaluated by the autonomous administrative authority for the protection of the rights of children and adolescents in the digital environment, under the terms of specific regulation.
§3. Social network providers may require the persons responsible for accounts with well-founded indications of being operated by children and adolescents to confirm their identification, including through complementary verification methods, observing that the data collected shall be used exclusively for age verification.
§4. In the face of well-founded indications that the account is operated by a child or adolescent in non-compliance with the minimum age requirements provided for in the legislation, social network providers shall suspend the user’s access and ensure the establishment of a swift and accessible procedure in which the legal guardian can file an appeal and prove the age by an appropriate means, under the terms of the regulation.
§5. In the absence of a user or account of the legal guardians, providers shall prevent the account's parental supervision settings from being changed to a lower level of protection than the standard established in Arts. 3 and 7 of this Law.
Art. 25. Social network providers shall provide for specific rules for the processing of data of children and adolescents, defined in a concrete and documented manner and based on their best interest.
Art. 26. The creation of behavioral profiles of child and adolescent users from the collection and processing of their personal data, including that obtained in age verification processes, as well as group and collective data, for the purpose of directing commercial advertising, is prohibited.
CHAPTER X
ON THE PREVENTION AND COMBATING OF GRAVE VIOLATIONS AGAINST CHILDREN AND ADOLESCENTS IN THE DIGITAL ENVIRONMENT
Art. 27. Providers of information technology products or services available in the national territory shall remove and report content of apparent exploitation, sexual abuse, kidnapping, and enticement detected in their products or services, directly or indirectly, to the competent national and international authorities, in the form of a regulation.
§1. Reports notifying content of exploitation, sexual abuse, kidnapping, and enticement of children and adolescents shall be sent to the competent authority, observing the requirements and deadlines established in the regulation.
§2. Providers shall retain, for the period established in Art. 15 of Law No. 12,965, of April 23, 2014 (Civil Framework for the Internet), the following data associated with a report of child or adolescent sexual exploitation and abuse content:
I – content generated, uploaded, or shared by any user mentioned in the report and metadata related to said content; and
II – data of the user responsible for the content and metadata related to them.
§3. The period referred to in § 2 of this article may be longer than that established in Art. 15 of Law No. 12,965, of April 23, 2014 (Civil Framework for the Internet), provided that a request is made in the form of § 2 of Art. 15 of said Law.
CHAPTER XI
ON THE REPORTING OF VIOLATIONS OF THE RIGHTS OF CHILDREN AND ADOLESCENTS
Art. 28. Providers of information technology products or services directed at children and adolescents or likely to be accessed by them shall make available to users mechanisms for notifications regarding violations of the rights of children and adolescents.
Sole Paragraph. Upon being notified of violations of the rights of children and adolescents within the scope of their services, providers shall, when applicable, officially inform the competent authorities for the initiation of an investigation, under the terms of the regulation.
Art. 29. To meet the principle of full protection, it is the duty of providers of information technology products or services directed at children and adolescents or likely to be accessed by them to proceed with the removal of content that violates the rights of children and adolescents as soon as they are informed of the offensive nature of the publication by the victim, their representatives, the Public Prosecutor’s Office, or representative entities for the defense of the rights of children and adolescents, regardless of a court order.
§1. The content referred to in Art. 6 of this Law shall be considered a violation of the rights of children and adolescents, under the terms of the age rating.
§2. The notification provided for in the caput of this article must contain, under penalty of nullity, elements that allow for the specific technical identification of the content pointed out as violating the rights of children and adolescents and of the author of the notification, with anonymous reporting being prohibited.
§3. Application providers shall make public and easily accessible the mechanism by which the notification provided for in the caput of this article must be sent by the notifier.
§4. Journalistic content and content subject to editorial control shall not be subject to the removal procedure referred to in the caput of this article.
Art. 30. In the content removal procedure referred to in Art. 29 of this Law, providers of products or services shall observe the right to contest the decision, ensuring the user who had published the content:
I – notification of the removal;
II – the reason and justification for the removal, informing whether the identification of the removed content resulted from human or automated analysis;
III – the possibility for the user to appeal against the measure;
IV – easy access to the appeal mechanism; and
V – the definition of procedural deadlines for filing an appeal and for a response to the appeal.
CHAPTER XII
ON TRANSPARENCY AND ACCOUNTABILITY
Art. 31. Internet application providers directed at children and adolescents or likely to be accessed by them that have more than 1,000,000 (one million) registered users in this age group, with an internet connection in the national territory, shall prepare semi-annual reports, in the Portuguese language, to be published on the provider’s website, which shall contain:
I – the available channels for receiving reports and the investigation systems and processes;
II – the number of reports received;
III – the number of content and account moderation actions, by type;
IV – the measures adopted for the identification of children’s accounts on social networks, as provided in § 3 of Art. 24, and of illicit acts, as provided in Art. 27 of this Law;
V – technical improvements for the protection of personal data and the privacy of children and adolescents;
VI – technical improvements to ascertain parental consent as provided in § 1 of Art. 14 of Law No. 13,709, of August 14, 2018 (General Personal Data Protection Law); and
VII – details of the methods used and the presentation of the results of the impact assessments, identification, and management of risks to the safety and health of children and adolescents.
Sole Paragraph. Internet application providers shall enable, free of charge, access to data necessary for conducting research on the impacts of their products and services on the rights of children and adolescents and their best interests, by academic, scientific, technological, innovation, or journalistic institutions, according to criteria and requirements defined in the regulation, with the use of this data for any commercial purposes being prohibited and compliance with the principles of purpose, necessity, security, and confidentiality of the information being ensured.
CHAPTER XIII
ON THE ABUSIVE USE OF REPORTING INSTRUMENTS
Art. 32. Internet application providers shall adopt effective mechanisms for the identification of abusive use of the reporting instruments provided for in this Law, with the objective of curbing their improper use for purposes of censorship, persecution, or other illicit practices.
Art. 33. Internet application providers directed at children and adolescents or likely to be accessed by them shall make available to users clear and accessible information about the circumstances of improper use of the reporting instruments, as well as about the applicable sanctions, observing due internal process.
§1. The following constitute sanctioning measures, among others that may prove adequate, proportional, and necessary to the gravity of the conduct:
I – temporary suspension of the infringing user’s account;
II – cancellation of the account in cases of recidivism or serious abuse; and
III – communication to the competent authorities, when there are indications of a criminal offense or violation of rights.
§2. Internet application providers shall establish and disclose objective and transparent procedures for the identification of abusive use of reporting instruments and for the application of the sanctions provided for in § 1 of this article, which shall contain, at a minimum:
I – definition of technical and objective criteria for the characterization of abuse;
II – notification to the user about the initiation of a procedure to investigate abuse and, if applicable, about the application of sanctions;
III – the possibility for the sanctioned user to file an appeal; and
IV – definition of procedural deadlines for filing an appeal and for a reasoned response from the provider.
§3. Internet application providers shall maintain detailed records of identified cases of abusive use and of the applied sanctions, with the objective of monitoring the effectiveness of the adopted mechanisms and promoting the continuous improvement of internal procedures, according to criteria and requirements defined in the regulation.
CHAPTER XIV
ON GOVERNANCE
Art. 34. The autonomous administrative authority for the protection of the rights of children and adolescents in the digital environment shall be responsible for overseeing compliance with this Law throughout the national territory and may issue complementary norms to regulate its provisions.
§1. The regulation may not, under any circumstances, authorize or result in the implementation of mechanisms of massive, generic, or indiscriminate surveillance, with practices that compromise the fundamental rights to freedom of expression, privacy, full protection, and differentiated treatment of the personal data of children and adolescents being prohibited, under the terms of the Federal Constitution and Laws No. 8,069, of July 13, 1990 (Statute of the Child and Adolescent), and No. 13,709, of August 14, 2018 (General Personal Data Protection Law).
§2. In the activities provided for in the caput of this article, the competent authority shall observe regulatory asymmetries and adopt a responsive approach, ensuring differentiated and proportional treatment for services of distinct nature, risk, and business model.
CHAPTER XV
ON SANCTIONS
Art. 35. Without prejudice to other civil, criminal, or administrative sanctions, in case of non-compliance with the obligations provided for in this Law, with due process of law, the right to a full defense, and the adversarial principle being ensured, infringers shall be subject to the following penalties:
I – a warning, with a period of up to 30 (thirty) days for the adoption of corrective measures;
II – a simple fine, of up to 10% (ten percent) of the revenue of the economic group in Brazil in its last fiscal year or, in the absence of revenue, a fine from R$ 10.00 (ten reais) to R$ 1,000.00 (one thousand reais) per registered user of the sanctioned provider, limited, in total, to R$ 50,000,000.00 (fifty million reais) per infraction;
III – temporary suspension of activities;
IV – prohibition of the exercise of activities.
§1. For the determination and gradation of the sanction, the following circumstances shall be observed, in addition to proportionality and reasonableness:
I – the gravity of the infraction, considering its motives and the extent of the damage in the individual and collective spheres;
II – recidivism in the practice of infractions provided for in this Law;
III – the economic capacity of the infringer, in the case of the application of a fine;
IV – the social purpose of the provider and the impact on the community with regard to the flow of information in the national territory.
§2. In the case of a foreign company, its branch, subsidiary, office, or establishment located in the country shall be jointly and severally liable for the payment of the fine referred to in item II of the caput of this article.
§3. The process of investigating infractions of the provisions of this Law and of applying the applicable sanctions shall be governed by the provisions relating to the investigation of administrative infractions of the norms for the protection of children and adolescents and the imposition of the respective penalties provided for in Law No. 8,069, of July 13, 1990 (Statute of the Child and Adolescent).
§4. The amounts of the fines provided for in item II of the caput of this article shall be annually updated according to the National Broad Consumer Price Index (IPCA), calculated by the Brazilian Institute of Geography and Statistics Foundation (IBGE), or another that may replace it, and published in the official press by the competent body of the Executive Branch, in the form of a regulation.
§5. The penalties provided for in items I and II of the caput of this article shall be applied by the autonomous administrative authority for the protection of the rights of children and adolescents in the digital environment, and those provided for in items III and IV of the caput of this article shall be applied by the Judiciary.
§6. The temporary suspension and the prohibition of the exercise of activities provided for in items III and IV of the caput of this article, when not directly implemented by the infringer, shall be carried out by means of a blocking order addressed to the telecommunications service providers that provide internet connection, to the entities managing internet exchange points, to the domain name resolution service providers, and to the other agents that enable the connection between users and content servers on the internet. (Regulation)
§7. (VETOED).
Art. 36. (VETOED).
Art. 36-A. The amounts resulting from the fines applied based on this Law shall be allocated to the National Fund for Children and Adolescents, established by Law No. 8,242, of October 12, 1991, for a period of five years, to be necessarily used in policies and projects that have the objective of protecting children and adolescents. (Included by Provisional Measure No. 1,318, of 2025)
CHAPTER XVI
FINAL PROVISIONS
Art. 37. The Executive Branch shall regulate, where applicable, the provisions of this Law.
Sole Paragraph. The regulation may not, under any circumstances, impose, authorize, or result in the implementation of mechanisms of massive, generic, or indiscriminate surveillance, with practices that compromise the fundamental rights to freedom of expression, privacy, full protection, and differentiated treatment of the personal data of children and adolescents being prohibited, under the terms of the Federal Constitution and Laws No. 8,069, of July 13, 1990 (Statute of the Child and Adolescent), and No. 13,709, of August 14, 2018 (General Personal Data Protection Law).
Art. 38. The packaging of personal use electronic equipment sold in the country that allows internet access, manufactured in Brazil or imported, shall contain a sticker, in the Portuguese language, that informs parents or legal guardians of the need to protect children and adolescents from access to websites with content that is improper or inadequate for this age group, under the terms of the regulation.
Art. 39. The obligations provided for in Arts. 6, 17, 18, 19, 20, 27, 28, 29, 31, 32, and 40 of this Law shall apply according to the characteristics and functionalities of the information technology product or service, modulated according to the degree of interference of the provider of the product or service over the content conveyed or made available, the number of users, and the size of the provider.
§1. Providers of services with editorial control and providers of content protected by copyright previously licensed from a responsible economic agent that is not an end user shall be exempt from complying with the obligations provided for in the articles referred to in the caput of this article, provided that:
I – they observe the age rating norms of the Executive Branch, when they exist, or, in their absence, the criteria for age appropriateness and clear signaling of potentially harmful content to children and adolescents, according to regulation;
II – they offer transparency in the age rating of the content;
III – they provide easily accessible technical parental mediation mechanisms that allow parents or legal guardians to exercise control over the way children and adolescents use the service, in order to enable the restriction of:
a) content, by age group;
b) personal data processed;
c) interaction with other users; and
d) commercial transactions;
IV – they offer accessible channels for receiving reports, exclusively regarding content that is non-compliant with the attributed rating or that violates the rights of children and adolescents, according to regulation.
§2. The obligations referred to in the caput of this article shall be applied in a manner proportional to the provider’s ability to influence, moderate, or intervene in the provision, circulation, or reach of the content accessible by children and adolescents.
§3. The regulation shall define objective criteria for assessing the degree of intervention and for the proportional application of the obligations provided for in this article.
Art. 40. The providers of the products or services referred to in Art. 1 of this Law shall maintain a legal representative in the country with powers to receive citations, summonses, or notifications, among others, in any judicial actions and administrative proceedings, as well as to respond before bodies and authorities of the Executive Branch, the Judiciary, and the Public Prosecutor’s Office, and to assume, on behalf of the foreign company, its responsibilities before the bodies and entities of the public administration.
Art. 41. (VETOED).
Art. 41-A. This Law shall enter into force six months after the date of its publication. (Included by Provisional Measure No. 1,319, of 2025)
Brasília, September 17, 2025; 204th year of Independence and 137th year of the Republic.
This text does not replace the one published in the DOU [Official Gazette of the Union] of 9.17.2025 – Extra edition
—————————————-
DECRETO 12.622/2025 (https://www.planalto.gov.br/ccivil_03/_ato2023-2026/2025/decreto/D12622.htm)
Presidency of the Republic Chief of Staff’s Office Special Secretariat for Legal Affairs
DECREE NO. 12,622, OF SEPTEMBER 17, 2025
Regulates Law No. 15,211, of September 17, 2025, to designate the National Data Protection Authority as the autonomous administrative authority for the protection of children and adolescents in digital environments, and to establish competencies for compliance with judicial blocking orders.
THE PRESIDENT OF THE REPUBLIC, in the use of the attributions conferred upon him by art. 84, caput, items IV and VI, subitem "a", of the Constitution, and in view of the provisions of Law No. 15,211, of September 17, 2025,
DECREES:
Art. 1. This Decree regulates art. 35, § 6, of Law No. 15,211, of September 17, 2025, to designate the National Data Protection Authority – ANPD as the autonomous administrative authority for the protection of children and adolescents in digital environments, and to establish competencies for the receipt of judicial blocking orders.
Art. 2. The ANPD is hereby designated as the autonomous administrative authority for the protection of the rights of children and adolescents in the digital environment, under the terms of the provisions of art. 2, caput, item X, of Law No. 15,211, of September 17, 2025.
Art. 3. The temporary suspension and the prohibition of the exercise of activities provided for in Law No. 15,211, of September 17, 2025, when not directly implemented by the infringer, shall be carried out by means of a blocking order.
§ 1. For compliance with the judicial blocking orders referred to in art. 35, § 6, of Law No. 15,211, of September 17, 2025, it shall be the responsibility of:
I – the National Telecommunications Agency – Anatel, to receive and distribute the orders to the telecommunications service providers that provide internet connection and to the other agents that enable the connection between users and content servers on the internet; and
II – the Brazilian Internet Steering Committee – CGI.br, to receive orders related to the resolution of name services registered under the “.br” domain.
§ 2. Anatel and CGI.br are granted the discretion, in accordance with the provisions of § 1, to define the most appropriate technique for implementing the blocking order.
Art. 4. This Decree shall enter into force on the date of its publication.
Brasília, September 17, 2025; 204th year of Independence and 137th year of the Republic.
This text does not replace the one published in the DOU [Official Gazette of the Union] of 09.18.2025.