Blog

Sometimes we take a break from building cutting-edge AI redaction models to stretch our academic muscles and write about privacy and machine learning.


Data Left Behind: AI Scribes’ Promises in Healthcare

We’ve talked a lot about how technology is transforming healthcare. From ambient listening devices to voice-based assistants, we’re seeing a data explosion. This shift (what many call a “data revolution”) is especially visible in the rise of AI scribes: tools that automatically generate clinical notes from doctor-patient conversations. The promise is big: faster diagnoses, more personalized care, and a lighter admin load for clinicians. But there’s a catch: most of this data never gets used.

Data Left Behind: Healthcare’s Untapped Goldmine

We’ve discussed how new technology is transforming healthcare: as the volume of electronic data continues to grow, many sectors describe this phenomenon as a data revolution. And though this revolution promises faster diagnoses and personalized care, it comes with a catch: most of that data is never even used. Welcome to healthcare’s quiet crisis: the abandonment of unstructured data.

The Future of Health Data: How New Tech is Changing the Game

The way healthcare organizations collect and use critical data is changing, and changing fast. From smartwatches to AI-powered documentation, new technologies are transforming how patient information flows, creating new opportunities for continuous health monitoring, early intervention, and improved clinical outcomes. But with so many tools entering the space, it can be hard to keep up.

Why is linguistics essential when dealing with healthcare data?

Clinical notes. Imaging reports. Lab results. Transcripts from patient conversations. It’s all full of critical insights, but none of it fits neatly into rows and columns – it’s all unstructured data.

Why Health Data Strategies Fail Before They Start

We’ve said it before, and we’ll say it again: healthcare data has the power to transform care. It can personalize treatments and speed up diagnoses in ways we’ve only dreamed of. But here’s the part nobody really likes to talk about: most healthcare data strategies fail before they even get off the ground.

Private AI to Redefine Enterprise Data Privacy and Compliance with NVIDIA

Toronto, Canada – February 20, 2025 – Private AI, a leader in privacy-preserving artificial intelligence, is proud to announce its integration with NVIDIA NeMo Guardrails, bringing advanced privacy and compliance capabilities to enterprises leveraging large language models (LLMs) and enabling them to unlock the potential of generative AI while safeguarding sensitive data.

EDPB’s Pseudonymization Guideline and the Challenge of Unstructured Data

The European Data Protection Board (EDPB) recently released its comprehensive Guidelines 01/2025 on Pseudonymisation, a document rich with practical insights into the application of pseudonymisation under the General Data Protection Regulation (GDPR).

HHS’ proposed HIPAA Amendment to Strengthen Cybersecurity in Healthcare and how Private AI can Support Compliance

On December 27, 2024, the U.S. Department of Health and Human Services (HHS), through its Office for Civil Rights (OCR), issued a proposed rule to enhance the cybersecurity measures required under the HIPAA Security Rule. This Notice of Proposed Rulemaking (NPRM) seeks to bolster the defenses of the U.S. healthcare system against the rising tide of cyberattacks, particularly those targeting electronic protected health information (ePHI). The changes aim to address critical weaknesses, clarify obligations, and align the Security Rule with modern cybersecurity practices.

Japan's Health Data Anonymization Act: Enabling Large-Scale Health Research

Anonymized and pseudonymized medical data are at the heart of cutting-edge research and innovation in healthcare. By stripping away personal identifiers and adding additional privacy-preserving measures, these data allow for advanced studies without compromising the privacy of individuals. In Japan, as elsewhere, the path to leveraging this valuable resource has been complex due to the need to balance large-scale data use with privacy protection. Thus, under the Act on the Protection of Personal Information (APPI), healthcare providers have faced challenges in sharing and processing medical data for research and innovation purposes, primarily due to strict consent requirements.

What the International AI Safety Report 2025 has to say about Privacy Risks from General Purpose AI

As the world gears up for the AI Action Summit in Paris in February 2025, global policymakers, researchers, and industry leaders are turning their attention to a landmark publication: The International AI Safety Report 2025. This report, a collaborative effort by 96 AI experts from around the world, represents the most comprehensive scientific assessment to date of the risks posed by general-purpose AI—a rapidly advancing form of AI with the ability to perform a wide range of tasks.

Private AI 4.0: Your Data’s Potential, Protected and Unlocked

I’m thrilled to announce Private AI 4.0. This release redefines how healthcare businesses harness their unstructured data while maintaining fine-grained privacy controls and the highest standards of compliance. With Private AI 4.0, we are turning challenges into opportunities by enabling confident, secure analytics that improve patient outcomes, accelerate research, and drive operational efficiency.

How Private AI Facilitates GDPR Compliance for AI Models: Insights from the EDPB's Latest Opinion

The European Data Protection Board (EDPB) has recently provided critical guidance on ensuring GDPR compliance during the development and deployment of AI models.

Navigating the New Frontier of Data Privacy: Protecting Confidential Company Information in the Age of AI

Artificial intelligence and large language models (LLMs) are transforming the way we work, and the boundaries of data privacy are being tested like never before. While most organizations have measures in place to secure their proprietary algorithms and internal data repositories, there’s a growing challenge that can’t be overlooked: the uncontrolled sharing of Confidential Company Information (CCI) by users interacting with third-party AI systems.

Enhancing Compliance with US Privacy Regulations for the Insurance Industry Using Private AI

The US insurance industry operates under a complex landscape of privacy laws and regulations designed to protect consumers’ personal information. At the heart of this regulatory framework are standards developed by the National Association of Insurance Commissioners (NAIC), alongside federal and state laws like the Gramm-Leach-Bliley Act (GLBA).

Belgium’s Data Protection Authority on the Interplay of the EU AI Act and the GDPR

The Belgian Data Protection Authority’s recent report, Artificial Intelligence Systems and the GDPR: A Data Protection Perspective, is a timely analysis exploring the interplay of the General Data Protection Regulation (GDPR) and the EU Artificial Intelligence (AI) Act.

Navigating Compliance with Quebec’s Act Respecting Health and Social Services Information Through Private AI’s De-identification Technology

Quebec’s new Act Respecting Health and Social Services Information (ARHSSI) introduces a notable tightening of data privacy requirements within the province, with a distinct emphasis on safeguarding health and social services information.

Unlocking New Levels of Accuracy in Privacy-Preserving AI with Co-Reference Resolution

In the fast-evolving world of AI, where data is at the core of decision-making, one challenge remains constant: ensuring the appropriate privacy and protection of individuals’ and organizations’ data. At Private AI, we’ve been working on solving this challenge with the introduction of Coreference Resolution as part of our 4.0 alpha release.

Strengthened Data Protection Enforcement on the Horizon in Japan

We previously wrote about Unlocking Compliance with the Japanese Data Privacy Act (APPI) using Private AI, and we are now following up with an assessment of the Interim Report on Considerations for the Triennial Review of the Act on Protection of Personal Information (Interim Report).