With the release of OpenAI’s ChatGPT in 2022, AI-powered chatbots have become ubiquitous, and other companies have released generative AI applications of their own, such as Microsoft’s Copilot and Google’s Gemini. Large language models (LLMs) can store data and draw on information from past interactions, which is reason enough to be wary of giving a model sensitive personal information.
AI companies train their models on data, including records of the model’s interactions with users. An AI model processes what it learns from you and may retain that information. Sensitive data you type into a chatbot could end up in training sets, surface in a breach, or otherwise circulate beyond your control, endangering your privacy and security. When interacting with an AI-powered chatbot, remember that the information you share may not remain private. Confidential information should remain confidential.
Types of information never to share with AI
Some types of information should never be shared with AI-powered chatbots for security and privacy reasons, including personal identification, proprietary corporate data, financial account details, and digital credentials.
Personal identification
Social Security numbers, school records, and driver’s licenses contain information about your identity and are best kept away from AI companies hungry for data to train their models. The same goes for medical records: if you use ChatGPT to analyze a patient’s medical history, redact any personal information that could reveal the patient’s identity first.
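As a rough illustration of that redaction step, the Python sketch below masks pattern-based identifiers with simple regular expressions before text ever reaches a chatbot. The patterns and placeholder labels are hypothetical examples, and regular expressions alone will not catch free-form identifiers such as names, so treat this as a starting point rather than a complete scrubber.

    import re

    # Illustrative patterns for common U.S. identifiers; not exhaustive.
    PATTERNS = {
        "SSN": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
        "PHONE": re.compile(r"\b\d{3}[-.]\d{3}[-.]\d{4}\b"),
        "EMAIL": re.compile(r"\b[\w.+-]+@[\w-]+\.\w{2,}\b"),
    }

    def redact(text: str) -> str:
        # Replace every match of each pattern with a bracketed placeholder.
        for label, pattern in PATTERNS.items():
            text = pattern.sub(f"[{label} REDACTED]", text)
        return text

    note = "Patient SSN 123-45-6789, phone 555-123-4567, jdoe@example.com."
    print(redact(note))
    # Patient SSN [SSN REDACTED], phone [PHONE REDACTED], [EMAIL REDACTED].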
Proprietary corporate data
A company’s proprietary information should not leave its premises; describing it to an AI-powered chatbot risks exposing trade secrets. Organizations should set strict rules on the use of AI, especially when the company’s bread and butter is on the line.
Financial accounts
Bank account numbers and investment records should be protected, not shared where cybercriminals might find them. Feeding chatbots sensitive financial information gives hackers and scammers one more way in. Keep in mind that cybercrime is a multi-billion-dollar industry; refrain from telling ChatGPT or Gemini anything that could lead to financial ruin.
Digital credentials
Most people’s lives are now tied to digital accounts, whether through apps on their phones or institutional systems that require digital credentials. User credentials such as usernames, passwords, PINs, and one-time passcodes (OTPs) should never be typed into a chatbot.
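One do-it-yourself safeguard is a simple check that flags credential-like strings before a prompt is sent. The sketch below is a minimal example under assumed keyword patterns, not a complete detector:

    import re

    # Flags text shaped like "password: ..." or "PIN = ..."; illustrative only.
    CREDENTIAL_HINTS = re.compile(
        r"\b(password|passwd|pin|otp|api[_-]?key)\b\s*[:=]\s*\S+",
        re.IGNORECASE,
    )

    def safe_to_send(prompt: str) -> bool:
        # Return False if the prompt appears to contain a credential.
        return CREDENTIAL_HINTS.search(prompt) is None

    print(safe_to_send("Here is my password: hunter2"))        # False: hold it back
    print(safe_to_send("How do I reset a forgotten password?"))  # True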
The more information you share with a chatbot, the more it learns about you and the higher the risk that your personal information is exposed. The makers of ChatGPT, Copilot, and Gemini have all warned users against sharing personal and sensitive information, which could be compromised once captured by the models.
ChatGPT, Copilot, or Gemini may prompt you to share more to stay in the conversation. Because these systems are designed and trained to keep users engaged, their follow-up questions can coax out more sensitive information than you intended to share.
“Just remember,” Nicole Nguyen wrote in The Wall Street Journal. “The chatbots are programmed to keep the conversation going. It’s up to you to hold back — or hit ‘delete.’”