The CEO of the company behind ChatGPT officially stated that any information users transmit to the chatbot can, if necessary, be used in court proceedings. The statement has reignited debate over the legal protection of privacy when interacting with artificial intelligence and highlighted gaps in current international legislation.
Today, millions of people interact with AI models such as ChatGPT every day, not only for business or technical tasks but also for personal matters, from relationships to psychological support. A chatbot is increasingly becoming a substitute for a therapist, mentor, or trusted confidant. However, unlike communication with a doctor, lawyer, or clergyman, which is protected by professional privilege, correspondence with AI does not fall under the same legal guarantees.
According to the company's head, in the event of a lawsuit or legal investigation, the company may be required to hand over users' correspondence at the request of a court or investigators. Experts say this creates serious risks for users, especially when sensitive topics are discussed, including family problems, health issues, or business information.
Although chats in the free version of ChatGPT are deleted after 30 days, the company reserves the right to retain data in certain cases, in particular for legal or security reasons. This provision is spelled out in the terms of service that every user accepts upon registration.
An ongoing legal case adds further complexity: the AI developer has been compelled to preserve the correspondence histories of millions of users on demand. The only exception is corporate clients, whose data is protected under a different level of confidentiality.
Against the backdrop of AI's rapid development and the large-scale involvement of users in interactive interfaces, the absence of a clear regulatory policy on digital privacy is becoming a challenge not only for businesses but also for human rights organizations. It is increasingly clear that AI, as a participant in communication, requires a new level of legal assessment.