The Costs of a Data Breach in the Healthcare Sector and its Privacy Compliance Implications

Kathrin Gardhouse
Apr 5, 2024

About a year ago we covered IBM's 2022 Cost of a Data Breach Report. That report revealed that costs were highest in the healthcare and financial industries, followed by pharmaceuticals and technology. The latest report shows the same pattern held in 2023. In fact, the average cost in the healthcare sector rose again, from USD 10.10 million in 2022 to USD 10.93 million in 2023. Over the past three years, this figure has increased by a whopping 53.3 percent.

In addition, the number of breaches recorded in the healthcare industry has more than doubled since 2017, the HIPAA Journal reports in its December 2023 Healthcare Data Breach Report.

This article covers the effects of data breaches on the healthcare sector, the value of health data, the cost of noncompliance with data protection laws, and how companies can mitigate these costs.

The Devastating Effects of Data Breaches in the Health Sector

Data breaches in the healthcare sector can have devastating effects. Even the bad guys know it. In December 2022, LockBit, then the largest ransomware operation, which runs a Ransomware-as-a-Service model, apologized to SickKids Hospital in Toronto, Canada, after an affiliate broke the rules LockBit imposes as conditions for the use of its services and attacked a healthcare provider with LockBit's ransomware encryption variant. Of course, this does not mean that healthcare as an industry is off-limits. In November 2023, a LockBit-affiliated threat actor stated during an attack on a US healthcare service provider: "We purposely didn't encrypt this hospital so as not to interfere with patient care. We just stole over 10 million files."

Then in February 2024, the arguably worst ransomware attack ever on the US healthcare sector hit UnitedHealth subsidiary Change Healthcare and exposed how fragile the healthcare system is. For context, Change Healthcare is the largest clearinghouse for insurance billing and payments in the States, with thousands of healthcare providers depending on its system to obtain insurance approval for prescribed services, including drugs, and to get paid for them. The hackers did not directly disable the provision of healthcare services; rather, they pressured Change Healthcare to meet their ransom demands by causing healthcare providers to scramble for funds to keep their doors open.

The Value of Health Data

One reason healthcare service providers are such a popular target for hackers is obvious: they are likely willing to pay, as the stakes are particularly high, with patients' lives quite literally on the line.

But a second, less obvious reason is that health records bring the biggest bang for the buck on the black market. Reportedly, medical records sell for USD 60, while a Social Security number brings in only USD 15 and a credit card number just USD 3. Why are health records so valuable? They have a long lifespan compared to a credit card number or other financial data, their misuse is harder to detect, and they enable impersonation to obtain prescription drugs, as well as tax fraud, phishing attacks, extortion, blackmail, and more.

So far it remains unclear whether Protected Health Information (PHI) was affected by the Change Healthcare breach and, if so, how. The HHS Office for Civil Rights (OCR), the entity tasked with enforcing the Health Insurance Portability and Accountability Act's (HIPAA) security, privacy, and breach notification rules, has launched an investigation to determine whether Change Healthcare has been HIPAA compliant and whether PHI was breached.

The Cost of Noncompliance

Noncompliance is an important cost factor, driving up the total cost of a data breach in highly regulated industries, including healthcare, by 23 percent, or USD 1.03 million, compared to industries subject to few or no regulations. This figure is down from 50.9 percent in 2022. Still, noncompliance remains the third most impactful cost amplifier according to the 2023 IBM Report, as it was in 2022; the first two are the security skills shortage and security system complexity.

Effective Cost Mitigation

The wisdom these days dictates that it is not a question of whether but when a security breach will occur. Preventing a breach is of course the best strategy, but since prevention is unlikely to succeed against every attack ever launched against an organization, efforts should also be made to shorten the data breach lifecycle (the time from discovery to resolution of a breach) and to ensure compliance with data protection laws and regulations. The 2023 Cost of a Data Breach Report recommends the following top strategies:

  • Build security into every stage of software development and deployment, and test regularly
  • Modernize data protection across hybrid cloud
  • Use security AI and automation to increase speed and accuracy
  • Strengthen resiliency by knowing your attack surface and practicing incident response

How We Can Help

Focusing on the first recommendation, here is how Private AI can help make software development and deployment safer. If you are using vast amounts of data to develop software components such as algorithms, best practice and data protection laws and regulations dictate that you include only the minimum amount of personal data necessary to achieve your goal. Private AI can redact and replace personally identifiable information in unstructured datasets with great accuracy. The safest data to work with is data that contains no valuable information at all; the most costly to lose is personal data, including health records.

Source: 2023 Cost of a Data Breach Report by IBM
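As an illustration of the data-minimization step described above, here is a minimal sketch of entity-based redaction. The regex patterns and placeholder tags are a toy stand-in for a production PII-detection model, not Private AI's actual API:

```python
import re

# Toy patterns standing in for a trained PII-detection model.
# Pattern names and replacement tags are hypothetical.
PATTERNS = {
    "SSN": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "EMAIL": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
}

def redact(text: str) -> str:
    """Replace each detected entity with a placeholder tag."""
    for label, pattern in PATTERNS.items():
        text = pattern.sub(f"[{label}]", text)
    return text

print(redact("Patient SSN 123-45-6789, contact jane@example.com"))
# → Patient SSN [SSN], contact [EMAIL]
```

In practice, regex rules miss context-dependent identifiers such as names; a model-based detector is what makes redaction of unstructured data reliable.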

Private AI's technology can also facilitate the fourth recommendation for reducing data breach costs: knowing your attack surface and practicing incident response. First, when you can easily determine where in your systems the largest amounts of the most valuable information are located, you know where hackers are most likely to attack. Private AI can produce a precise report indicating the location and type of personal data in your systems, helping you make this determination, particularly where unstructured data is concerned.
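The kind of location-and-type report described above can be sketched as follows; the detection results are hard-coded for illustration, standing in for the output of a PII-detection pass (file paths and entity labels are hypothetical):

```python
from collections import Counter

# Hypothetical detections per file, standing in for the output of a
# PII-detection pass over unstructured data.
detections = {
    "notes/intake.txt": ["NAME", "SSN", "NAME"],
    "billing/claims.csv": ["NAME", "CREDIT_CARD"],
}

def data_map(detections):
    """Summarize which entity types appear where, and how often."""
    return {path: Counter(labels) for path, labels in detections.items()}

for path, counts in data_map(detections).items():
    print(path, dict(counts))
```

A report of this shape both highlights the highest-value targets in your systems and doubles as the inventory needed for breach notification.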

Having the ability to determine where your valuable data is located and what it includes exactly is also a must for incident response strategies, one aspect of which is breach reporting.

Private AI can be used to produce a precise report indicating the location and type of personal data in the data affected by the breach, which can save a considerable amount of time. This is particularly important in cases where there are tight deadlines for reporting data breaches. The GDPR requires reporting “without undue delay and, where feasible, not later than 72 hours after having become aware of it.” Under HIPAA, reporting of breaches affecting 500 or more individuals must occur “without unreasonable delay and in no case later than 60 days following a breach.”
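The reporting windows above translate into hard deadlines counted from the moment of awareness. A minimal sketch, assuming the awareness timestamp is known and ignoring regulator-specific nuances:

```python
from datetime import datetime, timedelta

def reporting_deadlines(aware: datetime) -> dict:
    """Latest notification times under GDPR Art. 33 (72 hours) and
    HIPAA (60 days, for breaches affecting 500+ individuals)."""
    return {
        "GDPR": aware + timedelta(hours=72),
        "HIPAA_500_plus": aware + timedelta(days=60),
    }

aware = datetime(2024, 3, 1, 9, 0)  # hypothetical moment of breach awareness
print(reporting_deadlines(aware)["GDPR"])  # 2024-03-04 09:00:00
```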

Given that in 2023 it took organizations an average of 73 days to contain a data breach resulting from stolen or compromised credentials, according to the 2023 IBM Report, even a 60-day reporting period can seem short given the competing demands on everyone's time during a breach incident. Tools that help produce the report can be instrumental in avoiding significant fines.

Conclusion

In conclusion, the healthcare sector faces mounting challenges from escalating data breach costs and their profound implications. According to the 2023 Cost of a Data Breach Report, breach-related expenses in healthcare surged by 53.3 percent over three years, reaching an average of USD 10.93 million. The frequency of breaches has more than doubled since 2017, underscoring the urgency of addressing this critical issue. Beyond financial losses, breaches jeopardize patient safety and privacy, as seen in recent ransomware attacks on healthcare institutions.

Compliance with data protection laws is crucial, given the high value of health data on the black market. Noncompliance remains costly, driving up breach expenses and emphasizing the need for proactive risk mitigation measures. By implementing strategies outlined in the report, such as integrating security into software development and leveraging AI and automation, organizations can strengthen their defenses. Solutions like those offered by Private AI, which facilitate data redaction and streamline incident response, play a vital role in minimizing breach impact. Collaboration between industry stakeholders and innovative technologies will be essential in safeguarding patient data and mitigating the devastating consequences of breaches in the healthcare sector. Try our web demo to see for yourself, or talk to an expert today.
