Ever since its launch in November 2022, ChatGPT has gained considerable traction for the way it is reshaping how we use the internet. At work or at home, for writing articles, e-mails, social media posts, summaries of long texts and even computer code, it has become the ultimate productivity hack. However, the appeal of ChatGPT's capabilities seems to have created a blind spot around the potential dangers of data breaches. Even though ChatGPT is not designed to retain personal information long term, its use does carry real risks.
Conversations remain stored and accessible to third parties
OpenAI openly touts the way its artificial intelligence models can be fine-tuned over time, which it describes as one of their most useful and promising features. This ongoing improvement, however, is made possible in part by exposure to real-world problems and data. In other words, the data and conversations you enter into ChatGPT may be read, stored and used for training purposes to improve the service.
On the ChatGPT FAQ page, OpenAI points out that users' conversations are "stored on OpenAI systems and our trusted service providers' systems in the United States and the rest of the world." The page also notes that portions of exchanges may be shared with third parties for data annotation, security and support purposes. While bound by confidentiality obligations, certain authorized personnel as well as specialized subcontractors can also access and review a user's content. Knowing that several intermediaries have access to your conversation history, do we really need to remind you how important it is not to share confidential information?
ChatGPT is not immune to hackers
As with any application or third-party service, exposing your data online carries the risk of it being hacked, and ChatGPT is no exception. Even though it was developed with security measures in place, there is always the possibility of unforeseen vulnerabilities arising. Cybercriminals can exploit such flaws to access personal data or disrupt a system's operation. In March 2023, for example, a bug was discovered that had exposed conversation titles, the first messages of new conversations and payment information belonging to some ChatGPT Plus users.
In any event, to mitigate these data breach risks, we recommend that you limit the private information you provide when using ChatGPT or any other online service. We also strongly recommend consulting the privacy and security policy of the platform or application you are using so as to properly understand how your conversations are processed and protected. If in doubt, do not hesitate to contact one of our experts at MicroAge, who will be happy to assist you.