
Will OpenAI sign a BAA?

OpenAI, a leading provider of artificial intelligence (AI) language models, has revolutionized how businesses operate. However, healthcare providers handling protected health information (PHI) must comply with the Health Insurance Portability and Accountability Act (HIPAA), which requires covered entities to sign a business associate agreement (BAA) with vendors that handle PHI on their behalf.

So, will OpenAI sign a Business Associate Agreement (BAA) with healthcare organizations to ensure their use of OpenAI's services is HIPAA compliant?

 

What is a BAA?

A Business Associate Agreement (BAA) is a legally binding agreement between a covered entity, such as a healthcare provider, and a business associate who handles PHI on its behalf. The BAA outlines the terms and conditions for the use and disclosure of PHI, as well as the security and privacy obligations of the business associate.

 

Why it matters

HIPAA regulations require that covered entities only share PHI with business associates who have signed a BAA. This ensures that PHI is protected and that all parties comply with HIPAA regulations. As a result, healthcare organizations may be hesitant to use OpenAI's services without a BAA in place.

 

Does OpenAI sign a BAA?

At this time, it appears that OpenAI does not sign BAAs. Without a BAA in place, its services may not be HIPAA compliant.

 

OpenAI does take steps to protect the privacy and security of user data. 

  • When users interact with OpenAI, their data is encrypted both in transit and at rest. 
  • OpenAI does not collect or store any user data without explicit consent. 
  • Additionally, all interactions with OpenAI are confidential and protected by the terms of the OpenAI API access agreement.

 

OpenAI's terms state, "If you will be using the OpenAI API for the processing of 'personal data' as defined in the GDPR or 'Personal Information' as defined in CCPA, please... request to execute our Data Processing Addendum."

 

How do healthcare companies like Nuance use OpenAI and comply with HIPAA without a BAA?

Nuance, a company that provides speech recognition and natural language processing solutions for the healthcare industry, has partnered with OpenAI. Nuance uses a combination of its own AI models and OpenAI's language models to improve the accuracy and efficiency of its medical transcription, clinical documentation, and virtual assistant solutions.

Nuance does not require a BAA because it does not share any PHI with OpenAI. Instead of using PHI directly with OpenAI's services, it uses de-identified or anonymized data to train and improve the performance of its AI models.

Related: How to send HIPAA compliant emails

 

Protecting PHI: privacy concerns and risks

If you mistakenly input PHI into ChatGPT, it is unlikely to show up in an answer to another user, but it is (very slightly) possible.

As a precaution, healthcare professionals and users interacting with ChatGPT in a healthcare context are encouraged to avoid sharing sensitive information to minimize any potential privacy risks.

 

It actually depends on the account

Yaniv Markovski, Head of AI Specialist at OpenAI, said, “OpenAI does not use data submitted by customers via our API to train OpenAI models or improve OpenAI’s service offering… When you use our non-API consumer services ChatGPT or DALL-E, we may use the data you provide us to improve our models.”

Related: Safeguarding PHI in ChatGPT

 

How can healthcare organizations use OpenAI in a HIPAA compliant manner?

Healthcare organizations can use OpenAI's services in a HIPAA compliant manner by implementing appropriate security controls and policies. 

This includes the following steps:

 

1. Conduct a risk assessment

Perform a risk assessment to identify potential risks and vulnerabilities associated with using OpenAI's services. Analyze potential threats to PHI, such as unauthorized access, data breaches, or data loss.

 

2. Protect PHI in text input

Healthcare professionals should ensure that the text input does not include identifiable patient information. This can be achieved through de-identification techniques such as masking or tokenization.
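As a rough illustration, masking can be done with pattern-based substitution before any text leaves the organization. The patterns and labels below are hypothetical examples, not a complete de-identification solution; HIPAA's Safe Harbor method covers 18 identifier categories, so a production pipeline needs far broader coverage.

```python
import re

# Hypothetical patterns for a few common identifiers. A real
# de-identification pipeline must cover all 18 HIPAA Safe Harbor
# identifier categories (names, dates, geographic data, etc.).
PATTERNS = {
    "SSN": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "PHONE": re.compile(r"\b\d{3}[-.]\d{3}[-.]\d{4}\b"),
    "EMAIL": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "MRN": re.compile(r"\bMRN[:#]?\s*\d+\b", re.IGNORECASE),
}

def mask_phi(text: str) -> str:
    """Replace matched identifiers with a placeholder label before
    the text is sent to any external service."""
    for label, pattern in PATTERNS.items():
        text = pattern.sub(f"[{label}]", text)
    return text

print(mask_phi("Patient reachable at 555-867-5309, MRN: 4471."))
# → Patient reachable at [PHONE], [MRN].
```

Tokenization works the same way, except each match is replaced with a reversible token stored in a secure lookup table rather than a fixed label.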

 

3. Monitor chat logs

Chat logs should be monitored and reviewed regularly to ensure they do not contain any PHI. Use automated tools to detect and redact any PHI that may be present in chat logs. 
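A minimal sketch of such an automated check, assuming plain-text log lines and a few hypothetical identifier patterns (a real tool would use a broader PHI detector):

```python
import re

# Hypothetical hints that a log line may contain an identifier;
# flagged lines are queued for human review and redaction.
PHI_HINTS = [
    re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),        # SSN-like number
    re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),  # email address
    re.compile(r"\b\d{2}/\d{2}/\d{4}\b"),        # date-of-birth-like date
]

def flag_phi_lines(log_lines):
    """Return (line_number, line) pairs whose content matches any hint."""
    return [
        (i, line)
        for i, line in enumerate(log_lines, start=1)
        if any(p.search(line) for p in PHI_HINTS)
    ]
```

For example, `flag_phi_lines(["chief complaint: cough", "dob 01/02/1980"])` flags only the second line.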

 

4. Set up access controls 

Access to chat logs should be restricted to only those who need it to perform their job functions. 
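One way to enforce this is a simple role check in whatever tool exposes the logs. The role names below are hypothetical; the point is that log access is denied by default and granted only to designated roles.

```python
# Minimal role-based gate for a hypothetical log viewer: only the
# roles listed here may read chat logs; everyone else is denied.
ALLOWED_ROLES = {"compliance_officer", "privacy_auditor"}

def can_view_chat_logs(user_roles: set[str]) -> bool:
    """Grant access only if the user holds at least one allowed role."""
    return bool(ALLOWED_ROLES & user_roles)
```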

 

5. Establish policies and procedures

Develop policies and procedures that govern the use of OpenAI's services to protect PHI. These should address issues such as data access and data retention.

 

6. Have an incident response policy

Healthcare organizations should have an incident response plan outlining procedures for responding to a security incident involving ChatGPT.

The plan should include procedures for:

  • Identifying and containing the incident.
  • Notifying affected individuals.
  • Investigating the cause of the incident.

 

7. Train staff

Train staff on the proper use of OpenAI's services to ensure that PHI is protected and not entered into ChatGPT. This should include training on security best practices, data privacy, and incident reporting.

 

8. Do regular audits

Healthcare organizations should regularly audit their ChatGPT usage to identify potential security issues or compliance gaps. Audits should be conducted by an independent third party to ensure objectivity.

 

9. Conduct due diligence

Conduct due diligence when selecting a vendor that builds on OpenAI's services. Confirm the vendor has appropriate security controls and is committed to protecting PHI.

 

10. Manage vendors

Healthcare organizations should ensure that ChatGPT vendors meet their security and privacy requirements, including HIPAA compliance. Request that the vendor provide a business associate agreement that includes proper security and privacy terms. Ensure that the vendor undergoes regular security audits and assessments.

 

Prioritize privacy

While OpenAI does not sign a BAA, healthcare organizations can still use their services in a HIPAA compliant manner by taking appropriate security measures and following best practices. By prioritizing privacy and security, healthcare providers can benefit from the power of OpenAI's language models while protecting PHI and maintaining compliance with applicable regulations.

Related: HIPAA Compliant Email: The Definitive Guide

 

 
