7 out of 10 users interact with AI without knowing it, according to a digital law expert

"When you upload a document, you lose control over it. Nothing is free. That information is usually stored in public clouds and can be reused" - Lorena Naranjo Godoy, UDLA

3 January 2026, 07:15

The increasing use of Artificial Intelligence in digital services and platforms in Catalonia poses new challenges regarding privacy, ethics, and personal autonomy. Lorena Naranjo Godoy, director of the Master's in Digital Law at UDLA and head of the Digital Law Area at the Spingarn Law Firm, warns about the risks of delegating human thought to automated systems, especially when dealing with personal data and decision-making in sensitive areas.

Concern over Privacy and Personal Data Control

According to Naranjo Godoy, many people in urban environments such as Barcelona, Girona, or Tarragona use customer service chatbots and educational platforms without being fully aware that they are interacting with Artificial Intelligence systems. **The main risk identified is the loss of control over documents and personal data uploaded to these platforms.** The expert warns that this information can be stored in public clouds and reused, which calls for extreme caution in sectors such as security or the management of confidential information.

"When you upload a document, you lose control over it. Nothing is free. That information is usually stored in public clouds and can be reused" - Lorena Naranjo Godoy, UDLA

Impact on Education and Decision-Making

In the educational sphere, Artificial Intelligence allows for personalized feedback and processes adapted to each student's pace. However, Naranjo Godoy emphasizes that technology should not become a shortcut that replaces real learning or critical analysis. Used as a tool, AI can improve efficiency, but replacing human reasoning introduces risks of errors, biases, or "hallucinations" in the results.

The expert also points out that cases have been detected in which young people consult AI systems to decide how to vote or what political stance to adopt, which reflects a growing influence on personal and collective decisions.

Ethics and Responsibility in the Development and Use of AI

Naranjo Godoy distinguishes two levels of responsibility. On the one hand, there is the design of AI, which must anticipate ethical dilemmas in high-risk situations, as with autonomous vehicles. On the other, there is the use people make of it, accepting AI-generated responses as absolute truths without verification. Cases have been documented in which professionals presented false information generated by AI, with serious consequences.

The expert insists on the need for international ethical frameworks and shared responsibility among the state, companies, and users. The state must define clear limits on which decisions can be automated, companies must protect personal data, and users must avoid uploading sensitive information and must not delegate their judgment.

Challenge for Autonomy and Coexistence with Technology

The integration of Artificial Intelligence into administrative processes and daily services in Catalonia is irreversible. Naranjo Godoy states that the main challenge is learning to coexist with this technology without losing autonomy or critical thinking skills. The key lies in enhancing human abilities and supervising the use of AI, preventing it from becoming a substitute for personal relationships or one's own thinking.

"It's not just about knowing how to use technology. It requires awareness. Young people need to understand how to integrate Artificial Intelligence into their life projects without affecting their relationships, their mental health, or their privacy"

The debate on the role of Artificial Intelligence in Catalan society remains open, with a focus on the protection of digital rights and the training of critical and responsible users.