Quebec Law 25 – Scope: Personal Information of Service Providers

Apr 14, 2023

This guide is for you if you already know what Law 25 is and have read some of the other excellent materials out there that explain your obligations at a high level, but are still uncertain about some of the details. In particular, you may be wondering about the scope of the definition of personal information and whether, and to what extent, it applies to the information you are gathering on professionals who provide you with services. It is commonly thought that ‘business information’ falls outside the scope of privacy laws, but there is more nuance to that than you may expect.

Scope

There is no shortage of information about the scope of Law 25. You have probably heard that it applies to personal information, defined as “any information which relates to a natural person and directly or indirectly allows that person to be identified.” The information must be held by private organizations, or enterprises, as they are called in Québec. You may also be aware that there are several exceptions to what is considered personal information under the Act. The one we zero in on here is professional information. While we first need to get a bit technical with the language of the legislation, the final takeaway is practical: a lot of work-related information you may dismiss as falling outside Law 25 is in fact in scope. This means, for example, that access, rectification, and disposition requests by third-party service providers, such as brokers, agents, and consultants, fall under Law 25, and that automated decision-making regarding their compensation based on, say, performance information gives rise to the information obligations under section 12.1. Likewise, the information obligations arising from the use of profiling technologies, as well as the restrictions on profiling, apply equally to professional service providers.

Legal Analysis

Section 1(5) reads:

Divisions II and III of this Act do not apply to personal information which by law is public. Nor do they apply to personal information concerning the performance of duties within an enterprise by the person concerned, such as the person’s name, title and duties, as well as the address, email address and telephone number of the person’s place of work.

Three things to note:

1. First, and most straightforwardly, section 1(5) limits the exclusion of work-related information to Divisions II and III of the Act. The rights to access, rectification, and disposition, for example, are contained in Division IV, so the information described in section 1(5) remains subject to those rights.

2. Second, let’s look at the scope of the exclusion, in particular the phrase “within an enterprise.” At first glance, this phrase may seem to indicate that the excluded information relates to individuals working for the enterprise that collects the information. However, it is more plausible that “within an enterprise” simply means that the duties are performed in an organized economic capacity, and that it does not limit the exception in section 1(5) to information collected on personnel who work for the enterprise that is collecting the information. According to section 1 of Law 25, the definition of “carrying on an enterprise” is borrowed from article 1525 of the Civil Code:

The carrying on by one or more persons of an organized economic activity, whether or not it is commercial in nature, consisting of producing, administering or alienating property, or providing a service, constitutes the operation of an enterprise.

Consequently, the phrase “within an enterprise” ensures that the duties captured by section 1(5) are not of a domestic or social nature. It follows that information gathered on external professionals is, to the extent that section 1(5) says so, also covered by the exclusion.

3. Third, despite the fact that section 1(5) explicitly excludes work-related personal information from the application of Divisions II and III, these Divisions contain important obligations that still apply to personal information related to the performance of professional duties. This is because the phrase “concerning the performance of duties” in section 1(5) must be interpreted narrowly. The first hint that a narrow interpretation is appropriate is that the listed examples are little more than what you would expect to see on a professional profile on a company’s website or in a job description. Further contextual reading, particularly of section 8.1, the provision on identifying, locating, and profiling, and more specifically the definition of profiling contained in section 8.1(3), suggests that work performance falls within the scope of Law 25. While work performance is not explicitly mentioned as an example of personal information, it is listed alongside well-established examples of personal information, such as economic situation and health.

Obligations Regarding Work-Related Personal Information

Our analysis has shown that important obligations under Law 25 must still be met where professional information is concerned.

Right to Access, Rectification, and Disposition

If you are collecting personal information related to the work that someone performs, whether this is internal personnel or external professionals such as brokers, agents, and consultants, all of this information falls under the right to access and rectification as well as the disposition obligations set out in sections 27 and 28 of Law 25. Upon request, you are obliged to confirm the existence of this information, communicate it to the individual, and allow the individual to obtain a copy. Viewing the information must be free of charge under article 38 of the Civil Code, which applies in this context due to section 1.1 of Law 25; a copy must be provided at a reasonable cost according to the same provision. The right to rectification requires organizations to rectify inaccurate, incomplete, or equivocal information about the individual who so requests. The right to cause disposition of personal information is a bit hidden: section 28 refers to article 40(1) of the Civil Code, which in turn provides that individuals may cause obsolete information, or information not justified by the purpose of the file, to be deleted.
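To make this concrete, here is a minimal sketch in Python of how an organization might route access, rectification, and disposition requests from external service providers through the same workflow it uses for employees and customers. All names (ProviderRecord, handle_request, RECORDS) are hypothetical and the snippet is illustrative only; it is not a statement of what the law technically requires.

```python
from dataclasses import dataclass, field
from typing import Literal

# Hypothetical record of work-related information held about an external
# professional (broker, agent, consultant, ...). Names are illustrative only.
@dataclass
class ProviderRecord:
    provider_id: str
    name: str
    work_info: dict = field(default_factory=dict)  # e.g. performance notes, deal history

RECORDS: dict[str, ProviderRecord] = {}  # stand-in for your actual data store

def handle_request(provider_id: str,
                   kind: Literal["access", "rectification", "disposition"],
                   payload: dict | None = None) -> dict:
    """Route a rights request concerning work-related information.

    - access: confirm existence and communicate the information (viewing free of charge).
    - rectification: correct inaccurate, incomplete, or equivocal information.
    - disposition: delete obsolete information or information not justified
      by the purpose of the file.
    """
    record = RECORDS.get(provider_id)
    if kind == "access":
        # Confirm existence and return the information held; a copy may be
        # provided at a reasonable cost.
        return {"exists": record is not None,
                "data": record.work_info if record else None}
    if kind == "rectification" and record and payload:
        record.work_info.update(payload)  # apply the requested corrections
        return {"rectified": True}
    if kind == "disposition" and record and payload:
        for key in payload.get("fields_to_delete", []):
            record.work_info.pop(key, None)  # remove obsolete/unjustified items
        return {"deleted": True}
    return {"error": "request could not be processed"}
```

The design point is simply that records about external professionals flow through the same request-handling path as records about anyone else whose personal information you hold.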
Identifying, Locating, and Profiling

As mentioned above, it seems likely that the information obligations arising from the use of technologies that collect personal information and allow an individual to be identified, located, or profiled apply to professionals as well. The reason is that the definition of “profiling” explicitly captures the collection and use of personal information for the purpose of analyzing a person’s work performance. It follows that information regarding work performance cannot at the same time be excluded from the application of the Act by section 1(5). Hence, if information on the work performance of an individual is collected and used for profiling, this seems to trigger the obligation to inform the individual accordingly and to inform them of the means available to activate the functions that allow for the profiling. In a previous version of the act, the proposal was to require individuals to be informed about how to deactivate the identification, locating, or profiling functionality. It seems, therefore, that these functionalities must be turned off by default.

However, it is also possible to argue that the spirit of the law and the letter of the law differ in this respect. For example, if your organization collects performance data on individuals you are partnering with in such a way that the definition of profiling is met, but there is no risk of any human rights violation or discrimination, in particular because there is no dependency relationship as there would be with employees, an argument could be made that the law did not contemplate this situation and was not intended to restrict this kind of profiling.

Automated Decision-Making

Having established that work performance information does not fall under section 1(5), it also follows that the information obligations triggered by automated decision-making may apply when work-related information is processed and a decision is made automatically, i.e., without human intervention or oversight, on the basis of that information. For example, if a decision regarding someone’s commission is made on the basis of a score determined exclusively by an algorithm that analyzes all the information collected on the efficiency of the individual’s work, the individual, upon their request, must be informed:

(1) of the personal information used to render the decision;
(2) of the reasons and the principal factors and parameters that led to the decision; and
(3) of the right of the person concerned to have the personal information used to render the decision corrected.

This information must be provided, at the latest, when the decision is communicated to the individual.
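As an illustration of that disclosure, the following sketch (again in Python, with hypothetical names such as CommissionDecision and build_disclosure) shows one way an organization might assemble the three required pieces of information at the moment an automated commission decision is communicated. It is an assumption about how one could structure this, not a prescribed implementation.

```python
from dataclasses import dataclass

@dataclass
class CommissionDecision:
    """A decision made exclusively by an algorithm, without human intervention."""
    provider_id: str
    commission_rate: float
    inputs_used: dict             # the personal information fed into the model
    principal_factors: list[str]  # the main factors/parameters behind the score

def build_disclosure(decision: CommissionDecision) -> dict:
    """Assemble the information to be provided upon request, at the latest
    when the decision is communicated to the individual."""
    return {
        # (1) the personal information used to render the decision
        "personal_information_used": decision.inputs_used,
        # (2) the reasons and the principal factors and parameters that led to it
        "reasons_and_factors": decision.principal_factors,
        # (3) the right to have that personal information corrected
        "rectification_notice": (
            "You may request that the personal information used to render "
            "this decision be corrected."
        ),
    }

# Example: communicate the decision together with the disclosure.
decision = CommissionDecision(
    provider_id="broker-042",
    commission_rate=0.035,
    inputs_used={"deals_closed_q1": 12, "avg_response_time_h": 4.2},
    principal_factors=["deal volume", "client response time"],
)
print(build_disclosure(decision))
```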

Conclusion

As is often the case with the law, the devil lies in the details. What is ‘commonly known’ about the law, such as that privacy laws do not apply to business information, is often insufficiently nuanced and can give rise to serious compliance issues and hefty fines. Something as fundamental as the scope of the law’s application needs to be carefully determined as it forms the basis for practically all other decisions regarding the operationalization of the legal requirements.
