Your chats with AI can train the model: this is how your data is stored without your knowledge

According to Welivesecurity, conversations held on these platforms can be used to train artificial intelligence models if the default configuration is maintained.

14 February 2026, 18:45

Artificial intelligence platforms such as ChatGPT and Gemini pose information-security risks for the sensitive data of companies and individuals in Barcelona, Girona, and Tarragona. Cybersecurity experts warn against sharing personal, banking, or password data through these systems, as they are not designed to protect critical information.

Warnings about the use of AI in corporate environments

According to Welivesecurity, conversations held on these platforms can be used to train artificial intelligence models if the default configuration is kept. This means that any data entered could be stored and become accessible outside the corporate environment. The recommendation is clear: financial reports, strategic data, customer lists, and confidential projects should never be shared through these channels.

Risk of exposure to unauthorized access

If a cybercriminal gains access to the user account, all information shared on the platform can be exposed. Gemini and ChatGPT are not encrypted environments, nor do they guarantee protection for critical data. For this reason, the internal policies of many organizations in Catalonia require that sensitive business data be handled only in authorized tools and under strict security protocols.

Limitations in advice and protection

Artificial intelligence systems do not replace the professional judgment of doctors, lawyers, or financial advisors, nor do they have the details and context needed to resolve personal or business cases appropriately. Information entered may be stored and used by the platform, which increases the risk of data leaking beyond the control of the company or the user.

Recommendations and prevention measures

  • Report any incident or suspicious behavior to the platform; this helps protect your account and other users.
  • Avoid sharing sensitive or confidential information in unauthorized environments.
  • Always consult the organization's internal security policies before using artificial intelligence tools to process business data.
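As a practical complement to the second recommendation, organizations sometimes pre-filter text before it ever reaches an AI chat tool. The sketch below is purely illustrative and is not from the article: the regex patterns and the `redact` helper are hypothetical examples, and real data-loss-prevention tooling covers far more cases than these simple rules.

```python
import re

# Illustrative patterns only -- a real DLP tool handles many more
# formats (national IDs, addresses, API keys, etc.).
PATTERNS = {
    "EMAIL": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.-]+\b"),
    "CARD": re.compile(r"\b(?:\d[ -]?){13,16}\b"),   # card-like digit runs
    "IBAN": re.compile(r"\b[A-Z]{2}\d{2}[A-Z0-9]{10,30}\b"),
}

def redact(text: str) -> str:
    """Replace likely sensitive tokens with placeholders before the
    text is pasted into (or sent to) an AI chat platform."""
    for label, pattern in PATTERNS.items():
        text = pattern.sub(f"[{label} REDACTED]", text)
    return text
```

For example, `redact("Contact jane.doe@example.com, card 4111 1111 1111 1111.")` strips both the address and the card number before the text leaves the corporate environment. Such a filter reduces accidental leaks but does not replace the authorized tools and security protocols the article describes.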

"The main rule in cybersecurity is never share personal, banking, or password data, or any sensitive information with an artificial intelligence" - Spokesperson, Welivesecurity