ChatGPT: What you should never share with chatbots

Tue, 1 Apr 2025 8:13 GMT
Artificial intelligence can be a valuable tool, but caution is needed in its use - How to protect yourself.

The presence of ChatGPT in our daily lives is becoming more and more prominent, with users relying on its answers for a wide range of topics. But just as people learn from chatbots, the chatbots also learn from their users, often through conversations that include personal information.

This is because the more data one provides to the system, the more accurate its answers become. From medical tests to snippets of programming code, many users trust ChatGPT with content of a sensitive nature, the Wall Street Journal reports.

However, experts in the field of artificial intelligence urge caution: there is information that should not be shared with chatbots – not only for privacy reasons, but also to avoid potential misuse or leakage of data.

Warnings from the creators themselves

Even the companies developing the AI systems recognize the risks. OpenAI advises users:

“Please do not share sensitive information in your conversations,” while Google urges Gemini users to be equally cautious: “Do not enter confidential information or data you wouldn’t want anyone to see.”

The potential for a leak is not just theoretical. As the WSJ notes, conversations with chatbots may be used to train future models, and keywords related to security issues such as violence may trigger internal audits by the company itself.

Five categories of data you shouldn’t share

According to expert recommendations, these are the five main types of information you should never share with ChatGPT or another similar system:

1. Personal identifying information

ID or passport number, VAT number, date of birth, home address and phone number are details that should be kept out of conversations with AI.

2. Medical data

While it is tempting to ask for an interpretation of medical test results, it is important to remove all personal information from documents before submitting them.

3. Financial information

Avoid entering bank account numbers or other sensitive financial information.

4. Details of your work

People who use chatbots in their daily work routine often unknowingly expose trade secrets, customer data or internal company information. If the use of AI is essential to the job, it is recommended to use a professional subscription.

5. Login details

Chatbots are not built to act as repositories for passwords, PINs or other access credentials.
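The advice on medical documents above boils down to scrubbing identifying details before anything is pasted into a chatbot. As a minimal illustration, here is a sketch of a redaction step using simple regular expressions. The patterns below are assumptions for illustration only; real documents would need broader, locale-aware rules (and ideally a dedicated PII-detection tool).

```python
import re

# Illustrative patterns only -- not an exhaustive or production-grade PII list.
PATTERNS = {
    "EMAIL": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "PHONE": re.compile(r"\+?\d[\d\s-]{7,}\d"),
    "DATE": re.compile(r"\b\d{1,2}/\d{1,2}/\d{2,4}\b"),
}

def redact(text: str) -> str:
    """Replace each matched pattern with a [LABEL] placeholder."""
    for label, pattern in PATTERNS.items():
        text = pattern.sub(f"[{label}]", text)
    return text
```

For example, `redact("Born 03/05/1980, reach me at jane@example.com")` returns the text with the date and email replaced by `[DATE]` and `[EMAIL]` placeholders, so only the non-identifying content is sent to the chatbot.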

What to do to protect yourself

Managing privacy is also the user’s responsibility.

Experts recommend:

– Regularly delete the history of your conversations with chatbots.

– Use temporary chats, equivalent to the “Incognito Mode” of browsers, so that your information is not stored.

Artificial intelligence can be a valuable tool. However, caution is required when using it, because the more we tell it, the more it learns.

protothema
