Recent updates to OpenAI's privacy policy, terms of use applicable to individuals, and business terms have sparked the interest of the privacy and business community. The former two come into effect on Jan 31, 2024, except for the Europe-specific privacy policy, while the business terms have been in force since Nov 14, 2023. As we delve into these changes, it is crucial to understand their implications, particularly for individual users in Europe and for businesses subject to HIPAA, i.e., so-called covered entities, which include health plans such as health insurance companies, HMOs, company health plans, and certain government programs that pay for health care, such as Medicare and Medicaid. Here is an exploration of what these updates entail and how they affect the use of AI-driven solutions in different contexts.
Unpacking the Europe-Specific Privacy Policy
One of the most significant changes is the introduction of a Europe-specific privacy policy, which comes into force on Feb 15, 2024. This move aligns with the stringent data protection standards set by the GDPR. New features include the ability of individual users to exercise their privacy-related rights through their OpenAI account, a crucial step towards enhancing user control over personal data. The European privacy policy also sets out, explicitly for the first time, detailed legal bases for processing personal data. These include processing for contract performance, legitimate interests, and user consent, in line with the GDPR's requirements.
Data transfer mechanisms receive particular attention, acknowledging the complexities of transferring personal data outside the EEA, Switzerland, and the UK. The policy adheres to GDPR standards, utilizing European Commission’s adequacy decisions and Standard Contractual Clauses as safeguards.
This development, however, is specifically tailored for individual users and does not extend to business offerings. For businesses operating in Europe, understanding the nuances of the business terms will be key to ensuring compliance and maintaining user trust.
Business Terms
In the realm of business terms, not much has changed. Notably, the HIPAA restriction remains largely intact: protected health information (PHI) may not be included in user input unless a business associate agreement (BAA), required under HIPAA, is in place.
However, there is a subtle shift in the privacy language applicable to businesses: the former reference to "API data usage policies" has been replaced with "Enterprise Privacy Commitments."
Unfortunately, unlike with the business terms, OpenAI did not make the previous version (i.e., the API data usage policies) available for comparison. We are therefore left to assume that this shift is intended to signal an enhanced focus on comprehensive privacy practices that go beyond just API interactions, without being able to confirm what that means in substance.
The Challenges of Securing a Business Associate Agreement (BAA)
A critical aspect for businesses dealing with health information is securing a BAA with OpenAI or, if the OpenAI services are accessed via Microsoft Azure, with Microsoft directly. The demand for such agreements is high, leading to a bottleneck in customer support. This situation presents a challenge for companies looking to leverage AI solutions while complying with HIPAA.
Navigating the BAA process is more than a compliance hurdle; it’s a reflection of the growing need for robust data protection measures in the AI space. The struggle to secure a BAA underscores the importance of proactive engagement with privacy and security requirements in the evolving landscape of AI technology.
How Private AI Can Help
Where businesses cannot obtain a BAA in due time, and also because it is good practice, one solution is to approach HIPAA compliance via the Safe Harbor rule, assuming of course that the use case involves disclosing PHI. Removing the 18 identifiers listed in the rule from your data set permits disclosure without anything further: the de-identified information is no longer considered PHI to which HIPAA applies. Alternatively, if you instead remove any information regarding a health condition, the provision of health care to an individual, or the past, present, or future payment for the provision of health care to an individual, the information would likewise no longer constitute PHI, though it may still be personal information whose disclosure can be restricted under other applicable laws.
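To make the Safe Harbor approach concrete, here is a minimal, hypothetical sketch of pattern-based redaction for a few of the structured identifiers among the 18 (phone numbers, SSNs, email addresses, dates). The `PATTERNS` table and `redact` helper are illustrative assumptions, not a real product's API; note that free-text identifiers such as names and geographic subdivisions require NLP-based detection and slip through simple regexes, which is exactly why purpose-built de-identification tooling is needed:

```python
import re

# Hypothetical regex patterns covering a few of the 18 Safe Harbor
# identifiers. Names, locations, etc. need NLP-based detection and
# are deliberately NOT handled here.
PATTERNS = {
    "PHONE": re.compile(r"\b\d{3}[-.\s]\d{3}[-.\s]\d{4}\b"),
    "SSN": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "EMAIL": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.-]+\b"),
    "DATE": re.compile(r"\b\d{1,2}/\d{1,2}/\d{4}\b"),
}

def redact(text: str) -> str:
    """Replace each matched identifier with a [LABEL] placeholder."""
    for label, pattern in PATTERNS.items():
        text = pattern.sub(f"[{label}]", text)
    return text

note = "Call John Doe at 555-123-4567 or jdoe@example.com re: visit on 3/14/2023."
print(redact(note))
# The name "John Doe" survives redaction, illustrating the regex gap.
```

Running this redacts the phone number, email, and date but leaves the name untouched, underscoring that regexes alone cannot satisfy Safe Harbor.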
Private AI’s technology can detect, redact, and remove all 18 identifiers listed in the Safe Harbor rule (and more), greatly facilitating HIPAA compliance and enabling businesses to leverage LLMs safely. Beyond that, Private AI can also remove direct and indirect personal identifiers, including health conditions as well as names, addresses, and numerical identifiers. You can test the technology on your own data using our web demo, or sign up for an API key here.