ChatGPT: Data protection complaint against OpenAI over false information

The civil rights organization Noyb has filed a complaint against OpenAI due to data protection concerns and is demanding GDPR-compliant data processing.

ChatGPT app on a smartphone

(Image: Tada Images/Shutterstock.com)

This article was originally published in German and has been automatically translated.

The Austrian civil rights organization Noyb has filed a complaint about OpenAI with the Austrian data protection authority. The authority is asked to examine how OpenAI processes data for the large language models behind ChatGPT. Noyb is also demanding that the authority order OpenAI to bring its data processing into line with the GDPR and impose a fine to ensure the company complies with the rules in the future.

According to Noyb, OpenAI is not processing the complainant's request to correct or delete the inaccurate data, as correcting the data is said not to be possible. OpenAI has stated that it can filter or block data for certain requests, such as the complainant's name. However, this does not work without filtering out all information about the complainant.

According to Noyb, OpenAI "openly admits that it cannot correct incorrect information about individuals and cannot provide information about where the data comes from or what data is stored about individuals". Maartje de Graaf, privacy lawyer at noyb: "Inventing false information is problematic in itself. But when it comes to false information about individuals, it can have serious consequences. [...] If a system cannot provide accurate and transparent results, it cannot be used to generate data about individuals. The technology must follow the legal requirements, not the other way around."

Under Article 16 of the GDPR, data subjects have the right to have inaccurate data about them rectified. They can also request that incorrect information be deleted. In addition, under the "right of access" in Article 15, companies must be able to disclose what data they store about individuals and where that data comes from.

Since ChatGPT was opened to the general public at the end of 2022, OpenAI has repeatedly come under fire for data protection problems; the service was temporarily unavailable in Italy, for example, over such concerns. The hallucinations produced by generative language models are regularly criticized, as they can probably never be fully eliminated.

(mack)