AI, Security, and HIPAA: A Healthcare Leader’s Guide to Compliance
- With AI and LLMs, it’s time to reassess your security posture and the frameworks you use to meet HIPAA’s protections.
- You need an ethics and AI review board.
- Use Business Associate Agreements (BAAs) to protect patient data when working with third-party vendors.
As a healthcare provider, you know that your practice revolves around the Health Insurance Portability and Accountability Act of 1996 (HIPAA). Over the years, you’ve adapted to the requirements of HIPAA and added layers of security to protect your patients. However, AI is changing the landscape, and you want to ensure that you continue to have extensive security to protect your patients and maintain compliance with HIPAA.
Understanding the New Threat Landscape
With AI come new threats and vulnerabilities that demand new solutions. Many of these are issues that traditional firewalls can’t catch. Some of these new risks include:
- Data Poisoning: The data you use for training can be tampered with, compromising the model’s integrity.
- Model Inversion: An attacker may “reverse-engineer” a model to extract sensitive training data, including protected health information (PHI).
- Privacy Leaks: Large Language Models (LLMs) can “remember” and inadvertently share patient details.
A Strategic Framework for Healthcare AI Governance
Governing AI in healthcare while remaining compliant with HIPAA regulations starts with a strategic framework. This framework needs to include:
Principle 1: Establish an AI Ethics & Review Board
You need a person or an established group to oversee AI usage in your healthcare facility, confirm it’s being used appropriately, and verify it isn’t exposing patient information. This should be a cross-functional oversight committee: AI touches many areas of your business, and each department should have a hand in oversight, including clinical, legal, and IT stakeholders, so the committee can evaluate the “why” and “how” of every AI implementation.
Principle 2: Ensure Data Provenance and Anonymization
Create and maintain rigorous standards for where your data comes from, and de-identify it before it becomes part of your AI’s training data. Knowing the origin of the data also ensures that incorrect information doesn’t make its way into what your AI learns.
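To make the de-identification step concrete, here is a minimal, hypothetical sketch of pattern-based redaction on a clinical note. The patterns, tags, and sample record are invented for illustration; real HIPAA Safe Harbor de-identification covers 18 identifier categories and should be validated by an expert, not a handful of regular expressions.

```python
import re

# Hypothetical, minimal sketch of redaction before training.
# These patterns are illustrative only -- real de-identification
# must cover all 18 HIPAA Safe Harbor identifier categories.
PATTERNS = {
    "SSN": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "PHONE": re.compile(r"\b\d{3}-\d{3}-\d{4}\b"),
    "EMAIL": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "DATE": re.compile(r"\b\d{1,2}/\d{1,2}/\d{4}\b"),
    "MRN": re.compile(r"\bMRN[:#]?\s*\d+\b", re.IGNORECASE),
}

def deidentify(note: str) -> str:
    """Replace each matched identifier with a bracketed tag."""
    for tag, pattern in PATTERNS.items():
        note = pattern.sub(f"[{tag}]", note)
    return note

record = "Pt seen 03/14/2024, MRN: 889201, callback 555-867-5309."
print(deidentify(record))
# → Pt seen [DATE], [MRN], callback [PHONE].
```

A pipeline like this would run before any record reaches a training set, so the model never sees raw identifiers in the first place.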
Principle 3: Implement “Explainable AI” (XAI)
Clinical settings are moving away from “black box” models, where you can see the inputs and the conclusion reached but have no idea of the logic used to connect them. Increasingly, clinicians want to know how and why the AI reached a particular conclusion, so they can confirm it’s safe and hold the system accountable for any issues.
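One simple form of explainability is an additive model, where each input’s contribution to the score can be shown directly. The sketch below uses invented weights and features, not a validated clinical model; it only illustrates what an “explanation” can look like.

```python
# Minimal sketch of an explainable (additive) risk score: each
# feature's contribution is weight * value, so a clinician can see
# which inputs drove the result. Weights and features are invented
# for illustration, not a validated clinical model.
WEIGHTS = {"age": 0.03, "systolic_bp": 0.02, "hba1c": 0.5}

def explain(patient: dict) -> dict:
    """Return each feature's additive contribution to the score."""
    return {f: WEIGHTS[f] * patient[f] for f in WEIGHTS}

patient = {"age": 70, "systolic_bp": 150, "hba1c": 8.0}
contributions = explain(patient)
score = sum(contributions.values())

# Print contributions from largest to smallest, then the total.
for feature, c in sorted(contributions.items(), key=lambda kv: -kv[1]):
    print(f"{feature}: {c:+.2f}")
print(f"total risk score: {score:.2f}")
```

With this structure, a clinician reviewing the output can see, for example, that the HbA1c value drove the score more than age did, rather than being handed an unexplained number.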
Principle 4: Rigorous Third-Party Vendor Vetting
You rely on third-party vendors for the supplies and services your healthcare facility needs to thrive. However, they can introduce AI modeling issues into your own environment. You need to minimize these risks with Business Associate Agreements (BAAs) that specifically address AI data usage and model training rights, and you want these agreements in place before you start doing business together.
AI and HIPAA: Navigating Compliance in the Age of LLMs
AI and HIPAA demand a new model of business as usual, one that’s better than what came before. The technology may be new, but HIPAA still applies, and you’re still responsible for that security. Public versus private LLMs bring two major challenges: data leakage, and users inputting sensitive data, such as pasting a patient’s name, age, and other identifiers into a chat.
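A practical guardrail against the second challenge is to screen prompts for obvious PHI before they ever reach a public LLM. The sketch below is a hypothetical example: the patterns, the `safe_submit` helper, and the stand-in for the real LLM call are all assumptions, and a production guardrail would need far broader detection.

```python
import re

# Hedged sketch of a pre-submission guardrail that blocks prompts
# containing obvious PHI patterns before they reach a public LLM.
# The patterns and the stand-in "send" step are illustrative only.
PHI_PATTERNS = [
    re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),       # SSN-like numbers
    re.compile(r"\bMRN[:#]?\s*\d+\b", re.I),    # medical record numbers
    re.compile(r"\b\d{1,2}/\d{1,2}/\d{4}\b"),   # full dates (e.g., DOB)
]

def contains_phi(prompt: str) -> bool:
    """True if the prompt matches any known PHI pattern."""
    return any(p.search(prompt) for p in PHI_PATTERNS)

def safe_submit(prompt: str) -> str:
    """Block flagged prompts; otherwise hand off to the (stubbed) LLM."""
    if contains_phi(prompt):
        return "BLOCKED: prompt appears to contain PHI"
    return f"sent: {prompt}"  # stand-in for the real LLM call

print(safe_submit("Summarize discharge steps for knee replacement."))
print(safe_submit("Draft a letter for John, MRN: 42871, DOB 01/02/1980."))
```

Even a simple gate like this catches the most common mistake, pasting a record straight into a chat window, before it becomes a reportable incident.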
Governance as an Innovation Accelerator
When you’re ready to start working on AI projects, an AI governance framework lets you take that step into the future: with security in place, adoption can follow. You walk a fine line between “cutting edge” and “compliant,” and HPG stands ready to guide you as you bridge that gap. Ready to build your strategy? See how this applies to your core platforms: Contact us now.
Image Credit: PeopleImages/ Shutterstock