ChatGPT’s Response to Privacy Questions


We asked ChatGPT five privacy questions covering the GDPR, HIPAA, and US and Canadian private sector privacy laws, and evaluated its responses. We didn’t go easy on it, but ChatGPT hit it out of the park, except in two cases when asked to provide accurate links to its resources. Even then, the resources were easily located based on the information ChatGPT provided. This failure may well be due to its September 2021 knowledge cut-off: the resources existed at that time, but the URLs may have changed since.

Question 1: In the context of the General Data Protection Regulation (GDPR), under what circumstances can an organization legally transfer personal data to a country outside the European Economic Area (EEA), and what additional safeguards are required to ensure adequate protection of personal data?

ChatGPT:

Private AI Commentary: One could take issue with the introductory sentence here. ChatGPT first indicates that the following measures are designed to provide adequate protection for data transferred to countries where no such adequate protection standards exist, and then lists as the first measure the Commission’s adequacy decision, which attests precisely that a jurisdiction’s privacy standards are adequate for the protection of the data transferred to it. Also, all of the listed safeguards require not just putting contracts, rules, certifications, etc. in place; their enforceability is also a prerequisite, which ChatGPT does not mention. Furthermore, and this is not a factual mistake, the explicit consent exception mentioned under point 3 is actually one of the derogations mentioned under point 4. Aside from that, the response provides a decent summary of Chapter V of the GDPR.
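For readers who prefer a schematic view, the rough order in which the Chapter V transfer grounds are assessed can be sketched in a few lines of Python. This is a simplified illustration of our own, not legal advice and not part of ChatGPT’s answer; the function name, labels, and the reduction to three checks are our assumptions.

```python
# Illustrative sketch only (not legal advice): a rough ordering of GDPR
# Chapter V transfer grounds. Names and structure are our simplification.

APPROPRIATE_SAFEGUARDS = {
    "standard contractual clauses",   # Art. 46(2)(c)-(d)
    "binding corporate rules",        # Art. 47
    "approved code of conduct",       # Art. 46(2)(e)
    "approved certification",         # Art. 46(2)(f)
}

DEROGATIONS = {
    "explicit consent",                       # Art. 49(1)(a)
    "contract performance",                   # Art. 49(1)(b)-(c)
    "important reasons of public interest",   # Art. 49(1)(d)
}

def transfer_basis(destination_has_adequacy_decision: bool,
                   safeguard: str | None,
                   derogation: str | None) -> str:
    """Return the Chapter V ground a transfer would rely on, in rough order of preference."""
    if destination_has_adequacy_decision:
        # Art. 45: adequacy is a finding about the destination jurisdiction,
        # not an additional safeguard the exporter puts in place.
        return "adequacy decision (Art. 45)"
    if safeguard in APPROPRIATE_SAFEGUARDS:
        # Art. 46: the safeguard must also be enforceable, with effective
        # remedies for data subjects -- merely signing it is not enough.
        return f"appropriate safeguard: {safeguard} (Art. 46)"
    if derogation in DEROGATIONS:
        # Art. 49: derogations, including explicit consent, are a fallback
        # category for specific situations.
        return f"derogation: {derogation} (Art. 49)"
    return "no lawful transfer basis identified"
```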

Question 2: According to the Health Insurance Portability and Accountability Act (HIPAA), what are the key requirements and obligations for covered entities and business associates in terms of safeguarding protected health information (PHI) during transmission and storage?

ChatGPT:

Private AI Commentary: In light of the disclaimer at the bottom, this response is an acceptable summary of key requirements under HIPAA. As with the previous response, ChatGPT has listed “Transmission Security” as its own requirement, whereas HIPAA captures transmission security under the “Technical Safeguards” section along with access controls, audit logs, integrity, and authentication. While also not wrong, point 5 is not as informative as it could be. The second sentence is vague as to whether business associates are required to protect PHI as a result of the contract or because HIPAA applies to them as well (in fact it is both: a business associate agreement is required, and since the HITECH Act business associates are also directly subject to the Security Rule). It would also have been helpful to explain what business associates are. On the plus side, the provided link actually works and leads to further useful information.

Question 3: Under Canadian private sector privacy laws, what are the specific requirements for obtaining valid consent when collecting, using, or disclosing personal information, and how do these requirements differ from the consent requirements under GDPR?

ChatGPT:

Private AI Commentary: The disclaimer is again much needed in this case, as the answer is quite general and not entirely accurate for all Canadian laws. We also have to remember that ChatGPT’s knowledge cut-off is September 2021, so relevant developments, particularly the modernized Quebec private sector privacy law (Law 25), adopted on September 22, 2021, may fall outside of what ChatGPT can draw upon. However, given the importance of this law for many Canadian industries, the active debate that had been ongoing about its stricter approach to the protection of personal information, and the fact that the law is more closely aligned with the GDPR, to which the Canadian laws were supposed to be compared here, we would have expected a specific mention of it.

Question 4: When dealing with the rights of data subjects, what are the major differences between the General Data Protection Regulation (GDPR) and the US private sector privacy laws, particularly in terms of the scope of rights, the conditions for exercising those rights, and the processes for responding to data subject requests?

ChatGPT:

Private AI Commentary: This is our least favorite response by ChatGPT. To be fair, it is also the hardest question, because a comparison between the GDPR and “US private sector privacy laws” is inherently difficult: the legislative landscape in the US is a patchwork, and many of the laws focus narrowly on restricting the sale and sharing of personal information. The weakness of the response, in our view, is that ChatGPT nevertheless forces its answer into the requested format of comparing the two, even though the objects of comparison differ so considerably that speaking of two comparable objects is hardly appropriate here. The result is a set of very generic and repetitive statements. It would have been more helpful to start with the disclaimer that such a comparison is difficult, and to explain why.

Question 5: What is PII, where was this term developed, and where is it defined? Please provide links to your resources.

ChatGPT:

Private AI Commentary: As alluded to above, the links provided do not work. The accurate links are:

https://www.whitehouse.gov/wp-content/uploads/legacy_drupal_files/omb/memoranda/2007/m07-16.pdf

https://csrc.nist.gov/publications/detail/sp/800-122/final

We also note that the definition of PII provided is a simplification of the ones given in the cited resources. 

The OMB defines PII as:

information which can be used to distinguish or trace an individual’s identity, such as their name, social security number, biometric records, etc. alone, or when combined with other personal or identifying information which is linked or linkable to a specific individual, such as date and place of birth, mother’s maiden name, etc.

And the NIST defines it, very similarly, as follows:

any information about an individual maintained by an agency, including (1) any information that can be used to distinguish or trace an individual’s identity, such as name, social security number, date and place of birth, mother’s maiden name, or biometric records; and (2) any other information that is linked or linkable to an individual, such as medical, educational, financial, and employment information.

Note that while the examples ChatGPT provides are not all included in the cited definitions, they are included in the body of these documents.

ChatGPT is also correct in noting that PII is commonly used in the US. Most other jurisdictions use “personal information” or “personal data” instead. 
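For illustration, the two-pronged structure of the NIST definition quoted above can be captured in a few lines of Python. This is a hypothetical sketch of ours, not an official schema from NIST or OMB, and the field names and example values are purely illustrative.

```python
# A minimal sketch of the two-pronged NIST SP 800-122 view of PII.
# Field names below are our own illustration, not an official taxonomy.

# Prong (1): information that can distinguish or trace an individual's identity.
DIRECT_IDENTIFIERS = {
    "name",
    "social_security_number",
    "date_and_place_of_birth",
    "mothers_maiden_name",
    "biometric_records",
}

# Prong (2): other information that is linked or linkable to an individual.
LINKED_OR_LINKABLE = {
    "medical_information",
    "educational_information",
    "financial_information",
    "employment_information",
}

def is_pii(field: str) -> bool:
    """True if the field falls under either prong of the NIST definition."""
    return field in DIRECT_IDENTIFIERS or field in LINKED_OR_LINKABLE

print(is_pii("biometric_records"))       # True  (prong 1)
print(is_pii("employment_information"))  # True  (prong 2)
print(is_pii("favourite_colour"))        # False (on its own, in this sketch)
```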

Conclusion

We could not find any blatant factual errors in ChatGPT’s responses to our privacy questions, with the exception of faulty links when it was asked to provide the sources for its information. Much of the information provided was useful and presented in an intelligible and digestible manner. However, in some instances the answers could have been organized more closely in line with the privacy laws that were referenced. Also, when challenged to compare very different approaches to individuals’ rights under privacy laws, ChatGPT was not flexible enough to lead with a warning to that effect; instead it forced a comparison that ended up fairly disorganized, of little overall use, and silent on similarities that are nevertheless worth noting. Echoing ChatGPT’s own disclaimer, we recommend that anyone who wishes to use ChatGPT to learn about privacy legislation fact-check what it provides and consult resources written by experts when the questions concern the intricate details of data protection laws.
