The CCPA, CPRA, and California's Evolving Data Protection Landscape

Feb 1, 2024

The California Consumer Privacy Act (CCPA) has been in effect since January 1, 2020, and it has considerably changed the way businesses collect, use, and protect consumers' personal information. In November 2020, California voters approved Proposition 24, the California Privacy Rights Act (CPRA), which amends the CCPA and further strengthens California's data privacy laws. Most of the CPRA's provisions came into force on January 1, 2023, with enforcement commencing on July 1, 2023. On March 29, 2023, the California Consumer Privacy Act Regulations came into force, complementing the CPRA.

To keep your head from spinning in this whirlwind of new privacy legislation, this blog post provides an overview of the CCPA, the CPRA, and the corresponding regulations, with a focus on the data they protect, the concept of de-identification, and the impact of the CPRA's enactment.

Scope of Application

The CCPA, as amended by the CPRA, applies to any business that collects, processes, sells, or shares the personal information of California residents and meets certain thresholds: an annual gross revenue of over $25 million; annually buying, selling, or sharing the personal information of 100,000 or more California residents or households; or deriving 50 percent or more of its annual revenue from selling or sharing the personal information of California residents.

It's important to note that the CCPA and CPRA have extraterritorial reach, meaning that businesses located outside of California that meet these requirements are also subject to the laws. Furthermore, the revenue threshold is calculated on a business's overall gross revenue, not just revenue earned in California or derived from selling the personal information of California residents.
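To make these applicability tests concrete, here is a minimal Python sketch of the three alternative thresholds described above. The function name and parameter names are purely illustrative; this is a simplified sketch, not legal advice or part of any statute or library.

```python
# Illustrative sketch of the CCPA/CPRA applicability thresholds described above.
# A business that does business in California is in scope if it meets ANY one test.

def ccpa_applies(
    annual_gross_revenue_usd: float,
    ca_consumers_or_households: int,
    revenue_share_from_selling_or_sharing_pi: float,
) -> bool:
    """Return True if at least one of the three CCPA/CPRA thresholds is met."""
    return (
        annual_gross_revenue_usd > 25_000_000                # over $25M annual gross revenue
        or ca_consumers_or_households >= 100_000             # 100,000+ CA residents or households
        or revenue_share_from_selling_or_sharing_pi >= 0.50  # 50%+ of revenue from selling/sharing PI
    )


# Example: $10M in revenue, but personal information of 150,000 Californians -> in scope.
print(ccpa_applies(10_000_000, 150_000, 0.10))  # True
```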

Information Protected Under the CCPA, as Amended

As amended by the CPRA, the following categories of personal information are covered under the CCPA (Section 1798.140(v)(1)):

- Identifiers: names, aliases, postal addresses, unique personal identifiers, online identifiers, IP addresses, email addresses, account names, social security numbers, driver's license numbers, passport numbers, and other similar identifiers.
- Personal information: data such as physical characteristics or description (for example, height, weight, or gender), education information such as school and degree information, and employment information such as work history, salary, and benefits.
- Commercial information: records of products or services purchased, obtained, or considered, as well as other purchasing or consuming histories or tendencies.
- Biometric information: data based on unique biological characteristics, such as facial recognition data, iris or retina scans, fingerprints, voiceprints, and other similar information.
- Internet or other electronic network activity information: browsing history, search history, and information regarding a consumer's interaction with an internet website, application, or advertisement.
- Geolocation data: precise location data about a consumer, as well as general location data.
- Audio, electronic, visual, thermal, olfactory, or similar information: data such as audio recordings, video recordings, and other types of sensory information.
- Professional or employment-related information: data such as employment history, professional licenses, or certifications.
- Education information: data such as student records, transcripts, and other similar information.
- Inferences drawn from other personal information: information used to create a profile about a consumer, such as preferences, characteristics, psychological trends, predispositions, behavior, attitudes, intelligence, abilities, and aptitudes.

It's important to note that the CPRA introduces a new and broad category of sensitive personal information. It includes what you would probably expect, such as social security numbers and religious beliefs, but also the contents of text and email messages. However, the difference between a business's obligations for personal information and for sensitive personal information under the amended CCPA is limited. The disclosure provisions are identical, and for both, businesses must give consumers the option to opt out of the sale and sharing of the information. For sensitive personal information, consumers have the additional right to opt out of certain uses and disclosures of that information. Practically speaking, businesses must provide a "Do Not Sell or Share My Personal Information" and a "Limit the Use of My Sensitive Personal Information" button on their website.

De-identification Under the CCPA

The CCPA excludes de-identified information from the scope of personal information, meaning that the act does not apply to such data. "Deidentified" means information that cannot reasonably be used to infer information about, or otherwise be linked to, a particular consumer. However, to be exempt from the application of the CCPA, the business that de-identifies the data must also put specific safeguards in place to ensure the data will not be re-identified. The business is required to:

(1) Take reasonable measures to ensure that the information cannot be associated with a consumer or household.
(2) Publicly commit to maintain and use the information in deidentified form and not to attempt to reidentify the information, except that the business may attempt to reidentify the information solely for the purpose of determining whether its deidentification processes satisfy the requirements of this subdivision.
(3) Contractually obligate any recipients of the information to comply with all provisions of this subdivision.

Compared to the General Data Protection Regulation (GDPR) in the EU, de-identification under the CCPA is less onerous than the GDPR's anonymization, which requires that information no longer relate to an identified or identifiable natural person, or that personal data be rendered anonymous in such a manner that the data subject is not or no longer identifiable. Under the GDPR, it does not suffice that data cannot "reasonably" be used to identify an individual; the GDPR's standard is more absolute than that.

In practice, however, the method of de-identifying or anonymizing the data is the same. Both laws require the removal of direct and indirect identifiers and an evaluation of the re-identification risk, with the CCPA then making the additional demands of businesses described above. Private AI is uniquely equipped to help with the de-identification of personal data under both the GDPR and the CCPA. The vast majority of the data that falls under the CCPA's definition of personal information set out above is captured by Private AI's supported entities, i.e., entities that our algorithms can detect with unparalleled accuracy in unstructured data, in numerous file formats, and in over 50 languages. Exceptions include inferences drawn from personal information as well as purchasing history.
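As a rough illustration of what such a de-identification step can look like in a data pipeline, the sketch below sends unstructured text to a redaction service and receives back text with identifiers replaced by entity labels. The endpoint URL, request and response shapes, and field names are assumptions made for illustration only and do not represent a documented Private AI API.

```python
# Minimal sketch of a text de-identification call, assuming a hypothetical
# REST endpoint that detects identifiers in unstructured text and returns
# the text with each identifier replaced by an entity label.
import requests

DEID_ENDPOINT = "https://deid.example.com/process/text"  # placeholder URL


def deidentify(text: str, api_key: str) -> str:
    """Send raw text to the (hypothetical) de-identification service and return redacted text."""
    response = requests.post(
        DEID_ENDPOINT,
        headers={"x-api-key": api_key},
        json={"text": [text]},  # assumed request shape
        timeout=30,
    )
    response.raise_for_status()
    # Assumed response shape: a list of results with a "processed_text" field.
    return response.json()[0]["processed_text"]


if __name__ == "__main__":
    sample = "Jane Doe (jane.doe@example.com) visited our Sacramento store on 2024-01-15."
    print(deidentify(sample, api_key="YOUR_API_KEY"))
    # Illustrative output: "[NAME_1] ([EMAIL_ADDRESS_1]) visited our [LOCATION_1] store on [DATE_1]."
```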

Consumer Rights

In addition to the right to opt out of the sale and sharing of their personal information and to limit the use and disclosure of sensitive personal information, California residents have the right to know what personal information businesses collect about them, and the right to request that businesses correct or delete their personal information.

Businesses subject to the CCPA must provide California residents with a privacy policy that outlines the categories of personal information collected, the purposes for which the information is used, and the categories of third parties with whom the information is shared.

The CPRA also expands consumers' right to know about automated decision-making and profiling based on their information.

Effect of the Amendments

The CPRA creates a new enforcement agency, the California Privacy Protection Agency (CPPA), which has the power to investigate violations, issue subpoenas, hold public hearings and, importantly, impose fines and penalties for violations of the CCPA and CPRA.

The coming into force of the CCPA in January 2020 seems not to have affected businesses' ad revenues much at all: only 1-5 percent of consumers opted out (survey results from November 2020). Comparing this to the impact of Apple's App Tracking Transparency (ATT) roll-out, one could speculate that the reason is that the CCPA requires only an opt-out, whereas ATT provides the consumer with data protection by default. ATT requires that apps ask users whether they can track them, and over 80 percent politely declined. Yet when users must take active steps to protect their rights, inertia may get in the way of exercising the right not to have personal information sold or shared for profit. Similarly, access requests have been far from numerous.

An interesting effect, however, was that many companies initially opted not to sell information in the first place, concerned about the optics of the required button on their website. Furthermore, privacy-preserving advertising is becoming increasingly popular, with new solutions capturing additional market share.

Conclusion

The CCPA and CPRA have ushered in a new era of data privacy regulation in California, and businesses must take these laws seriously to avoid fines and penalties for non-compliance. Compliance with the CCPA and CPRA requires a significant investment of time and resources, but it is essential for businesses that collect, use, or share personal information. By implementing robust data privacy programs and staying up to date on the latest developments, businesses can protect consumer privacy and build trust with their customers. On the other hand, the limited extent to which consumers make use of their new rights, contrasted with the clear industry trend towards privacy-preserving advertising driven by new privacy laws, underscores that privacy protection cannot be the responsibility of the consumer. Rather, the behaviour of organizations has to change, and the threat of fines and reputational loss does a pretty good job at that, as the IAPP-EY Privacy Governance Report 2023 shows: 33 percent of companies grew their privacy teams over the past year alone.
