Navigating Compliance with Quebec’s Act Respecting Health and Social Services Information Through Private AI’s De-identification Technology

Kathrin Gardhouse
Oct 29, 2024

Quebec’s new Act Respecting Health and Social Services Information (ARHSSI) introduces a notable tightening of data privacy requirements within the province, with a distinct emphasis on safeguarding health and social services information.

The Act mandates that all information held by certain public bodies and potentially entrusted to third parties must remain confidential unless explicitly authorized by the individual to whom it relates. A distinctive aspect of this Act is the obligation to use or communicate information in a de-identified form whenever possible. For organizations operating in Quebec, this represents a stringent requirement that applies across all use cases, making compliance both critical and challenging.

In this article, we provide an overview of the Act and explore how Private AI’s state-of-the-art de-identification technology can help organizations comply with some of these obligations, reduce risk exposure, and avoid the steep fines associated with breaches of the ARHSSI.

Scope

The Act designates a broad range of organizations as health and social services bodies (HSSBs) subject to the Act. This includes key public health entities like the Ministry of Health and Social Services, the Health and Welfare Commissioner, the Commission on End of Life Care, and the Régie de l’assurance maladie du Québec (Health Insurance Board). Specialized institutions like Héma-Québec, which manages blood services, and the Institut national de santé publique du Québec (National Public Health Institute) are also included. The list further extends to private facilities, such as specialized medical centers, private seniors' residences, assisted procreation centers, and funeral service providers. This list reflects a comprehensive approach, encompassing public, private, and specialized health and social services providers across the province.

"Health and social services information," as defined by this Act, encompasses any data that can identify an individual, directly or indirectly, and pertains to their physical or mental health, medical history, biological samples, or use of disability aids. It also includes information about the specific health or social services received, including service details, outcomes, and provider identity. Personal identifiers like name, birth date, and health insurance number are also considered health information when linked to these data or collected during registration or care admission. However, information collected for human resources purposes about health workers or contractors is excluded from this definition.

Key Highlights of ARHSSI

The ARHSSI constitutes strong protection for health and social services information (HSSI), with wide-reaching implications for entities that handle HSSI in Quebec. While the Act introduces many important measures, the following highlights capture the core obligations of organizations under its scope:

  1. Collection, Use, Disclosure, and Retention of Information: Organizations must ensure that the collection, use, disclosure, and retention of personal health information are done transparently and with clear justification. The collection must be limited to the minimum necessary. As a default, consent is required for the use and communication of HSSI. In case of communication of HSSI outside of Quebec, a privacy impact assessment (PIA) is necessary prior to the communication. HSSI must only be retained for as long as necessary, and safeguards must be in place to protect its confidentiality.
  2. Obligatory De-Identification: A central provision of the ARHSSI is that personal information must be used or communicated in a de-identified form whenever possible. This requirement applies broadly to the internal use of data by organizations, as well as to external communications, including with service providers and researchers.
  3. Access Restrictions: Individuals have the right to restrict access to their HSSI, or to certain pieces of it, by particular service providers (or categories of providers), by their relatives, and by researchers or for certain research projects, with very limited exceptions. The Act also provides for access limitations regarding internal personnel, ensuring that access is limited to the purposes for which the information was collected or consistent purposes, with few additional permissions.
  4. Rights of Access to Information: Individuals have a right to access their own personal information, and certain related persons (e.g., guardians, family members of minors, and relatives of deceased persons) also have defined access rights. Service providers and researchers, too, have access rights under certain conditions. Researchers, in particular, have to comply with stringent requirements around their access request, including submitting a PIA along with their access request. Organizations must be prepared to facilitate these requests and comply with any limitations and the detailed procedures imposed by the Act.
  5. Technological Products: An organization subject to the Act must conduct a PIA for any project to acquire, develop or overhaul technological products or services or an electronic service delivery system where the project involves the collection, keeping, use, communication or destruction of information held by the organization. The PIA is not required if the technological product is certified by a procedure determined by regulation, and in the process of obtaining this certification, a PIA had already been conducted. The organization must also record all technological products it uses in a register which it has to publish on its website or by other means.
  6. Confidentiality Incident: The obligations surrounding confidentiality incidents are notably broad under the Act. Risk mitigation and incident prevention obligations are triggered as soon as there is a risk of a confidentiality incident occurring, not only once it has occurred. A regulation accompanying the Act seems to imply, however, that notification obligations apply only once the incident has occurred. The regulation also sets out the details of the required notices.
  7. Governance and Responsibilities: The Act introduces stringent governance measures, assigning key roles such as the Network Information Officer to oversee compliance. The Minister of Health and Social Services also has an oversight role, responsible for ensuring that bodies comply with the Act’s standards for information governance.
  8. Oversight and Penalties: The Act empowers authorities to perform inspections, investigations, and impose significant penalties for violations. Penalties for breaches, including improper communication of information, range from $5,000 to $100,000 for individuals and from $15,000 to $150,000 for organizations. These penalties underscore the importance of adhering to the Act’s stringent privacy requirements.

Data Minimization, De-Identification, and Anonymization

Focusing on the obligations under the Act that Private AI’s redaction technology can most directly assist with, we dedicate this section to data minimization, de-identification, and anonymization.

Data minimization requires, as we briefly touched upon above, that only the HSSI necessary to fulfil the purposes for which it is collected may be collected. The retention limitation, which permits keeping data only for as long as necessary to fulfil its purpose, can also be captured under this principle, but the mechanisms for meeting these two requirements differ: to minimize collection, data intake forms have to be scrutinized, or technological solutions have to be implemented to automatically block the collection of unnecessary data. For retention limitation, the data must be destroyed after a certain period of time, or, as the Act clarifies, it could alternatively be anonymized.
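The collection-side mechanism described above, blocking unnecessary fields before they are ever stored, can be sketched as a purpose-specific allowlist filter. This is a minimal illustration, not part of any product described here: the field names, the `appointment_booking` purpose, and the `minimize_intake` helper are all hypothetical.

```python
# Minimal sketch of collection-side data minimization. An intake record
# is represented as a dict, and only fields on a purpose-specific
# allowlist survive; everything else is dropped before storage.
# All field and purpose names below are hypothetical examples.
ALLOWED_FIELDS = {
    "appointment_booking": {"name", "health_insurance_number", "preferred_date"},
}

def minimize_intake(record: dict, purpose: str) -> dict:
    """Keep only the fields necessary for the stated purpose."""
    allowed = ALLOWED_FIELDS.get(purpose, set())
    return {k: v for k, v in record.items() if k in allowed}

raw = {
    "name": "Jean Tremblay",
    "health_insurance_number": "TREJ 6105 1234",
    "preferred_date": "2024-11-15",
    "mother_maiden_name": "Gagnon",  # not needed for booking, so dropped
}
print(minimize_intake(raw, "appointment_booking"))
```

The same allowlist could drive validation on intake forms themselves, so that unnecessary fields are never requested in the first place.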

Although the regulator released a regulation that adds some details regarding how destruction of HSSI needs to be performed, it is regrettably silent on what is required for anonymization. Organizations would be well advised to rely on the anonymization regulation under Law 25 for guidance.

Moving on from collection and destruction/anonymization to use and communication of HSSI, the Act is quite unique in that it explicitly requires the de-identification of HSSI “where such information can be used or communicated in a form that does not allow the person concerned to be identified directly.” This requirement is not limited to a particular use case such as research; in practice, however, research may well be the most common scenario in which HSSI can be used in a form that does not allow for direct identification.

Nevertheless, for organizations processing health and social services information, this presents an operational challenge—ensuring that every instance of communication or use of personal data defaults to de-identification regardless of whether consent has been obtained.

Private AI’s Role in Ensuring Compliance

Private AI’s innovative privacy-enhancing technology is designed specifically to address the complexities associated with de-identification of personal data, particularly in regulated sectors like healthcare and social services. The company’s machine-learning models are uniquely suited to automatically detect and redact or remove personally identifiable information (PII) and protected health information (PHI), greatly facilitating adherence to the Act’s rigorous requirements.

Here’s how Private AI supports compliance with the ARHSSI:

  1. Automated De-identification at Scale: The ARHSSI requires that information must be used or communicated in a de-identified form whenever possible. Private AI’s technology automates the detection and redaction of such data in real time, whether it is structured (like databases) or unstructured (like emails, reports, doctors’ notes, and even audio or video files). This capability allows organizations to ensure that information is de-identified by default, reducing human error and administrative burdens. It also allows for granular selection of the entity types to be removed, a prerequisite for the flexibility needed when minimizing the HSSI used for different purposes.
  2. Seamless Integration with Existing Data Systems: Mindful of the Act’s high bar for communicating HSSI outside the province, Private AI’s solutions can be deployed on-premises or through secure API integrations that connect to servers within Canada where required. This ensures that de-identification happens without data leaving the organization’s controlled environment, or the country.
  3. Meeting Anonymization Standards: The ARHSSI mirrors the anonymization definition from Law 25, requiring that anonymization renders data irreversibly unidentifiable. Private AI supports these requirements by applying context-aware de-identification that ensures data is redacted according to generally accepted best practices. While it depends on the dataset and the use case whether this amounts to anonymization rather than mere de-identification, the removal of direct and indirect identifiers is always the first, and often very onerous step, when conducting anonymization.
  4. PIAs and Breach Reporting: Private AI’s value is not limited to redacting PII. Before redaction, the technology must first identify where PII appears in the data, which is a hard problem, especially in unstructured data. For both PIAs and confidentiality incident reporting, it is essential to accurately determine what and how much HSSI is present in any given IT system. With Private AI, this can be automated even for free-text fields in databases, embedded files, storage buckets, and network drives, including handwritten content on PDF scans, Word documents, and images.
  5. Preventing Regulatory Fines: Organizations that fail to comply with the ARHSSI’s de-identification mandates are exposed to significant fines. For individuals, fines range from $5,000 to $100,000, while for legal entities, they can reach $150,000 where information that cannot be communicated under the Act is nevertheless communicated. This is a severe penalty for failing to de-identify HSSI for uses that do not require full identifiability. By implementing Private AI’s technology, organizations can proactively ensure that personal data is de-identified when communicated, significantly lowering the risk of breaching the Act.
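The de-identify-by-default pattern described in items 1 and 5 above can be sketched in a few lines. To keep the example self-contained, the two regex patterns below (a simplified RAMQ-style health insurance number and ISO dates) stand in for the far broader, ML-based entity detection of an actual de-identification service; the `deidentify` and `communicate` functions are hypothetical and not part of any real API.

```python
import re

# Illustrative de-identify-by-default sketch. The patterns below are
# simplified stand-ins for ML-based entity detection: a RAMQ-style
# health insurance number (4 letters + 8 digits) and ISO-format dates.
PATTERNS = {
    "HEALTH_INSURANCE_NUMBER": re.compile(r"\b[A-Z]{4}\s?\d{4}\s?\d{4}\b"),
    "DATE": re.compile(r"\b\d{4}-\d{2}-\d{2}\b"),
}

def deidentify(text: str, entities=tuple(PATTERNS)) -> str:
    """Replace the selected entity types with labelled placeholders.

    The `entities` parameter illustrates granular selection of which
    entity types to remove for a given purpose.
    """
    for name in entities:
        text = PATTERNS[name].sub(f"[{name}]", text)
    return text

def communicate(text: str, send) -> None:
    """Route every outbound communication through de-identification."""
    send(deidentify(text))

note = "Patient TREJ 6105 1234 seen on 2024-11-01 for follow-up."
print(deidentify(note))
# → Patient [HEALTH_INSURANCE_NUMBER] seen on [DATE] for follow-up.
```

Wrapping outbound channels in a function like `communicate` is what makes de-identification the default rather than an afterthought: data can only leave the controlled environment after the redaction step has run.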

Conclusion

Private AI empowers organizations to meet the ARHSSI's rigorous de-identification standards and much more by providing real-time, automated solutions that protect sensitive data and minimize compliance risk. With Quebec leading the way in comprehensive privacy legislation, now is the time to ensure that your data practices are both compliant and future-ready.

To see the tech in action, try our web demo, or get an API key to try it yourself on your own data.
