The Lei Geral de Proteção de Dados (LGPD), Brazil’s answer to data privacy, sets out the rules that organizations handling personal data must follow, unless they anonymize the data. This article delves into data anonymization and pseudonymization, what they mean for data processing activities, and the LGPD’s stringent response time for access requests. We compare these selected aspects of the LGPD with the GDPR and explore how technologies like those from Private AI can help organizations anonymize or pseudonymize data efficiently and comply with the onerous access request obligations.
Understanding Anonymization under the LGPD
- Definition and Scope: Anonymized data is defined as data relating to a data subject who cannot be identified, considering the use of reasonable and available technical means at the time of the processing. The LGPD likewise defines anonymization as the use of reasonable and available technical means at the time of processing by which data loses the possibility of direct or indirect association with an individual. Lastly, the LGPD provides that anonymized data shall not be considered personal data, except when the anonymization process to which it was submitted is reversed using its own means, or when it can be reversed with reasonable efforts. This definition aligns somewhat with the GDPR, under which anonymized data also falls outside the scope of data protection law because the data subject is not identifiable.
- Usage of Anonymized Data: Under the LGPD, once data is anonymized, it is no longer considered personal data and falls outside the act’s scope. This means organizations can use anonymized data freely, without adhering to the privacy protections and rights obligations required for personal data. It offers a pathway for analytics, research, and other data-driven activities while maintaining compliance. The language of the law is not terribly strict on this point. For example, it states that the processing of personal data may only be carried out under certain enumerated circumstances, one of which reads: for carrying out studies by research entities, ensuring, whenever possible, the anonymization of personal data. The provision on the processing of sensitive personal data reads identically in this regard.
- Comparative Analysis with GDPR: The GDPR and the LGPD share similarities in their approach to anonymization. Both treat anonymized data as non-personal, freeing it from the respective data protection regulations. However, the GDPR is more explicit about the irreversibility of anonymization, implying a higher standard for the process. An interesting detail in the LGPD is that it explicitly excludes data from the definition of anonymized data when they are used to formulate behavioral profiles of a particular natural person, if that person is identified. It is unclear what prompted this exclusion, since it is rather obvious that the definition of anonymized data would not apply in this scenario. Let’s take it as a signaling provision that emphasizes the sensitivity of personal profiles.
The Role of Pseudonymization
- LGPD’s Stance on Pseudonymization: Pseudonymization under the LGPD involves processing personal data in such a way that it can no longer be attributed to a specific data subject without the use of additional information, which must be kept separately by the controller in a controlled and secure environment. This process, while a valuable security measure, does not change the data’s status as personal under the LGPD, unlike anonymization.
- Impact on Compliance: Pseudonymized data still requires adherence to the LGPD’s provisions. In fact, the law recommends it as a security measure when processing personal data for public health studies, an oddly narrow scope for this useful technique, by the way.
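To make the mechanism concrete, here is a minimal Python sketch of pseudonymization as the LGPD describes it: direct identifiers are swapped for random tokens, and the token-to-identity mapping is held in a separate store. The function and field names are our own illustrative assumptions, not any particular product’s API, and a real deployment would keep the mapping in an access-controlled, encrypted system rather than an in-memory dictionary.

```python
import secrets

def pseudonymize(records, id_field):
    """Replace the value of `id_field` in each record with a random token.

    Returns the pseudonymized records plus the token-to-identity mapping,
    which must be stored separately and securely (per the LGPD) because
    it allows re-identification.
    """
    vault = {}  # the "additional information" kept apart from the data
    pseudonymized = []
    for record in records:
        token = "SUBJ-" + secrets.token_hex(4)
        vault[token] = record[id_field]
        pseudonymized.append({**record, id_field: token})
    return pseudonymized, vault

records = [{"name": "Ana Souza", "purchase": "book"}]
safe, vault = pseudonymize(records, "name")
# `safe` now contains only tokens; re-identification requires `vault`.
```

Because the vault still permits re-identification, the output remains personal data under the LGPD, which is exactly the distinction from anonymization drawn above.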
The Significance of Rapid Response to Access Requests
If the use case for processing the data does not allow for anonymization, data subjects have a right to access the information held about them by organizations governed by the LGPD. The law mandates a short response time of 15 days for data subject access requests, compared to 30 under the GDPR, emphasizing the need for efficient data management systems. Organizations must be prepared to promptly identify, access, and compile personal data in response to these requests.
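The arithmetic of the deadline itself is simple, as the short sketch below shows; the operational challenge lies in locating and compiling the data in time. This is a minimal illustration using calendar days, and the function name is our own assumption.

```python
from datetime import date, timedelta

# LGPD: complete response to an access request within 15 days,
# versus roughly 30 days (one month) under the GDPR.
LGPD_RESPONSE_DAYS = 15

def response_deadline(received: date) -> date:
    """Return the latest date a full LGPD access response is due."""
    return received + timedelta(days=LGPD_RESPONSE_DAYS)

deadline = response_deadline(date(2024, 3, 1))
# deadline == date(2024, 3, 16)
```

In practice, a request-tracking system would compute this on intake and alert the privacy team well before the window closes.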
Private AI’s Contribution to LGPD Compliance
- Facilitating Efficient Data Mapping: Private AI’s technology can swiftly identify and categorize over 50 types of personal data entities, a necessity for meeting the LGPD’s access request deadlines. Particularly where unstructured data, such as free text, is concerned, this can otherwise be a time-consuming process, depending on the amount of data in an organization’s systems.
- Enhancing Anonymization Processes: By utilizing advanced algorithms, optimized for various file types, Private AI can help organizations effectively anonymize data, ensuring it falls outside of the LGPD’s purview.
- Supporting Multilingual and Context-Sensitive Processing: Private AI’s ability to handle diverse languages and contextual nuances aligns with the LGPD’s broad territorial scope, which likely captures great linguistic diversity. The LGPD applies not only to processing carried out in Brazil but also to processing related to offering goods or services to individuals in Brazil. It further applies where “the personal data being processed were collected in the national territory” and explains that “data collected in the national territory are considered to be those whose data subject is in the national territory at the time of collection.” In summary, if an organization processes personal data related to individuals in Brazil, the LGPD applies regardless of the origin of that data. It would come in very handy if the tool used for personal data detection and redaction supported 52 languages!
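To illustrate what entity detection and redaction look like in principle, here is a deliberately simplified Python sketch. Production systems such as Private AI’s rely on machine learning models covering many languages and entity types; the two regular-expression patterns below (email addresses and Brazilian CPF taxpayer numbers) are toy assumptions for demonstration only and would miss most real-world personal data.

```python
import re

# Toy patterns for two entity types; real detectors use ML models,
# not regexes, and cover dozens of entity types and languages.
PATTERNS = {
    "EMAIL": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "CPF": re.compile(r"\b\d{3}\.\d{3}\.\d{3}-\d{2}\b"),  # Brazilian taxpayer ID format
}

def redact(text: str) -> str:
    """Replace each detected entity with a placeholder naming its type."""
    for label, pattern in PATTERNS.items():
        text = pattern.sub(f"[{label}]", text)
    return text

print(redact("Contato: maria@example.com, CPF 123.456.789-09"))
# Contato: [EMAIL], CPF [CPF]
```

Replacing entities with typed placeholders, rather than deleting them outright, keeps the text useful for analytics while removing the identifying values.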
Brazil’s LGPD places significant emphasis on the proper handling of personal data, with many more obligations than are covered here. This article highlighted that anonymization offers a gateway for organizations to utilize data without the constraints of the LGPD, provided the process is irreversible in light of reasonable measures taken. Additionally, one of the most stringent access request timelines globally can likely not be met if attempted manually, given the vast amounts of data many companies process today. In this context, Private AI’s technology emerges as a critical tool, enabling organizations to navigate these complex requirements efficiently and effectively, enhancing data privacy and security in Brazil’s digital ecosystem. Try it on your own data using our web demo or get a free API key here.