Digital Administration and Data Protection: Tensions, Challenges, and Safeguards in the Age of Transparency

Electronic administration isn’t just a change of format; it’s a shift in mindset. Every time you enter a portal and click, you leave a trail that works both for auditing and surveillance. That supposed efficiency they promise also brings a fragility that not everyone talks about.

1) Electronic administration as opportunity and risk

Digital portals are convenient, sure: you can handle paperwork from home without waiting in line. But be careful: that convenience turns the citizen into someone who's always being recorded. And not all of us want to live in that kind of permanent display case.

Valero Torrijos said it years ago: administrative efficiency is never neutral. The more digital the system becomes, the bigger the asymmetry with the public. And here’s my take: that unchecked power becomes dangerous. A mistake on paper could be corrected quickly; in digital format, an error gets cloned on a massive scale. Efficiency has a cost, and we’re not always warned about it.

2) Transparency and exposure

Transparency sounds great in a democracy: “more data, more trust.” But let’s be honest, absolute transparency borders on exhibitionism. Do we really need lists with full names floating around the internet? What was intended as a public service often ends up breaking social trust.

Cerrillo already warned that accountability can turn into a show. And he’s right. If we want useful portals, we can’t strip the citizen bare. Transparency is fine, but always with moderation. Personal dignity is worth more than a few extra clicks in a search engine.

3) The European reference framework

Europe has spent years reminding us that privacy isn’t a luxury—it’s a right. The GDPR was a turning point: paperwork alone isn’t enough; now you have to justify every technological decision. And honestly, that annoys more than one administration that would prefer to stay on autopilot.

The good thing is that this change forces us to take privacy seriously, as a daily habit, not a footnote.

4) New regulatory layers

The GDPR is the base, but it’s surrounded by a dizzying ecosystem of rules: ENS, Data Act, AI Act… Sometimes it feels like a legal sudoku. Still, there’s a clear idea: “I didn’t know” is no longer acceptable.

The AI Act, for example, requires algorithms to be auditable. And here’s the key: if an algorithm distributes scholarships, it can’t be a black box. Citizens have the right to understand what’s going on inside. I’ll take a stand: whoever fears transparency in algorithms has something to hide.

5) Processors and providers

Outsourcing tech services is fine, but it doesn't free you from responsibility. If an administration hires a provider and then checks out, it only multiplies the security gaps.

The APDCAT makes it clear: the ultimate responsible party is still the administration. Delegating isn’t the same as walking away. And citizens shouldn’t pay the price of institutional negligence.

6) Education and minors

Minors are always the most exposed. They post without thinking of the consequences, and the problem is that many adults don’t know how to guide them. This is where programs like CLI–PROMETEO come in: teaching everything from passwords to how to react to cyberbullying.

The GDPR already sets special conditions, but let’s be realistic: if the interface isn’t understandable for a 13-year-old, something’s wrong with the design. Education isn’t bureaucracy, it’s prevention.

7) Health data

The VISC+ case taught us something uncomfortable: total anonymity is an illusion. There’s always a risk of reidentification. And if you cross-reference several databases, that risk skyrockets.
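That cross-referencing risk can be shown with a toy sketch in Python (every name, field, and value here is hypothetical, not taken from VISC+): even after the direct identifier is hashed away, quasi-identifiers like postal code and birth date can still single a patient out once a second dataset is joined in.

```python
import hashlib

SALT = b"demo-salt"  # illustration only; a real deployment would use a secret salt

def pseudonymize(name: str) -> str:
    """Replace a direct identifier with a salted hash."""
    return hashlib.sha256(SALT + name.encode()).hexdigest()[:12]

# A "de-identified" health record: the name is gone...
health = [{"pid": pseudonymize("Anna Puig"), "zip": "08001",
           "birth": "1990-05-01", "dx": "asthma"}]

# ...but a second, public dataset shares the same quasi-identifiers.
census = [{"name": "Anna Puig", "zip": "08001", "birth": "1990-05-01"}]

# Joining on (zip, birth) re-identifies the patient despite the hashing.
reidentified = [(c["name"], h["dx"])
                for h in health for c in census
                if (h["zip"], h["birth"]) == (c["zip"], c["birth"])]
print(reidentified)  # [('Anna Puig', 'asthma')]
```

The point of the sketch is not that hashing is useless, but that removing the name alone does nothing against linkage attacks.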

Personalized medicine needs data, but to what extent can a patient be turned into an open book? Here we must be clear: science moves forward, but not at the cost of dignity.

8) Smart cities

Smart cities sound modern: sensors, cameras, data to manage traffic better. The problem is that without limits, we end up in an urban panopticon shaped like a traffic light.

APDCAT repeats it: privacy by design. Yes, we want innovation, but no one wants to live in a city that monitors them even when crossing a zebra crossing.

9) Privacy by design

Privacy by default isn’t legal posturing, it’s culture. It means thinking about protection from the start. Not as a patch, but as part of the system’s DNA.

There are plenty of examples: portals that only ask for the minimum or anonymized school records. What’s interesting is that when it’s done well, people barely notice. And that’s the point: ethics becomes good engineering.
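A "portal that only asks for the minimum" can be sketched in a few lines. This is not any administration's actual code, just a minimal illustration of data minimization with a hypothetical allow-list: anything the procedure doesn't need is dropped before it is ever stored.

```python
# Hypothetical allow-list for one specific procedure.
ALLOWED_FIELDS = {"name", "email"}

def minimize(submission: dict) -> dict:
    """Keep only the fields this procedure actually needs; drop the rest."""
    return {k: v for k, v in submission.items() if k in ALLOWED_FIELDS}

form = {"name": "Jordi", "email": "j@example.org",
        "religion": "...", "salary": 30000}
print(minimize(form))  # {'name': 'Jordi', 'email': 'j@example.org'}
```

The design choice matters: the allow-list makes minimization the default, so forgetting to blocklist a sensitive field can never leak it.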

10) Good practices

There are basic rules that should be tattooed on the skin of any administration:

  • Collect only the data that are necessary.
  • Set retention periods.
  • Require strict contracts with providers.
  • Conduct impact assessments.
  • Configure privacy by default.
  • Carry out external audits.

Does it sound technical? Maybe, but we’re actually talking about politics. This is how the State shows that it respects its citizens.
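"Set retention periods" is the rule most easily automated. A minimal sketch, assuming a hypothetical one-year policy (the actual period depends on the legal basis of each processing activity):

```python
from datetime import date, timedelta

RETENTION = timedelta(days=365)  # hypothetical policy: one year

def is_expired(collected_on: date, today: date) -> bool:
    """A record past its retention period should be deleted or anonymized."""
    return today - collected_on > RETENTION

print(is_expired(date(2020, 1, 1), date(2024, 1, 1)))  # True
print(is_expired(date(2024, 1, 1), date(2024, 6, 1)))  # False
```

A nightly job running a check like this turns a policy on paper into something the system actually enforces.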

11) Inevitable tensions

Transparency versus privacy, efficiency versus fairness… These tensions are inevitable. And let’s say it clearly: eternal balance doesn’t exist. Each decision opens a new front. The important thing is knowing where we tilt the scale.

12) Connections and cross-learning

Here’s what’s interesting: academic texts open debates, institutions try to respond, and schools turn it all into habits. No single layer is enough. The key is the dialogue among all of them.

If that coherence is missing, the system becomes a half-finished puzzle.

13) Future challenges

Artificial intelligence in public administration multiplies risks. This isn’t paranoia: a poorly designed algorithm can unfairly decide scholarships or aid. And although the AI Act requires transparency, we all know how easy it is to fall into the temptation of using black boxes.

Same with the European digital identity: practical, yes, but also a treat for hackers. If it fails, everything fails. Cutting corners on security isn’t an option.

14) Email at work

Email is still the go-to tool in offices and administrations. But beware: it's not just a digital inbox; it's also a showcase of how we handle information. Misuse can open the door to leaks or, worse, sanctions.

A couple of basic recommendations:

  • Check the recipients every time. The “reply all” button can get you in trouble if you share sensitive data with the wrong person.
  • Avoid attaching documents with personal information unless necessary. And if there’s no other choice, protect them with a password.
  • Don’t mix work and personal matters. Using your work email for private issues becomes a risk (and is usually prohibited).

And what about signing or encrypting emails? Well, it depends on the level of security you need:

Digitally signing proves that the message comes from you and hasn’t been altered along the way. In official procedures or communications with legal value, it’s more than recommended.

Encrypting emails ensures that only the recipient can read the content. It’s not needed for everything, but if you work with health data, payroll, or disciplinary files, encryption should be the norm.
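The "hasn't been altered along the way" guarantee can be illustrated with Python's standard library. One caveat up front: real email signing uses asymmetric keys (S/MIME or PGP), which the stdlib doesn't provide; this sketch uses an HMAC with a shared secret only to show the integrity-check idea, and the key and message are made up.

```python
import hashlib
import hmac

# Illustration only: real digital signatures use asymmetric key pairs.
SECRET = b"shared-secret-key"

def tag(message: bytes) -> str:
    """Compute an integrity tag over the message."""
    return hmac.new(SECRET, message, hashlib.sha256).hexdigest()

def verify(message: bytes, received_tag: str) -> bool:
    """Constant-time comparison: any alteration in transit changes the tag."""
    return hmac.compare_digest(tag(message), received_tag)

msg = b"Resolution attached."
t = tag(msg)
print(verify(msg, t))          # True: message intact
print(verify(b"Altered.", t))  # False: tampering detected
```

Note the use of `compare_digest` instead of `==`: comparing tags in constant time avoids leaking information through timing.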

15) Security cameras and video surveillance

Security cameras are everywhere: at the subway entrance, in the office, even in residential buildings. And keep in mind: recording images isn't free in terms of privacy. Every time a camera points at you, personal data is being processed (yes, your face counts as data).

Regulations say that cameras can only be used for their authorized purpose (usually security). You can’t install a camera “just in case” and then use it for something else. People must also know they’re being recorded: hence the famous yellow signs with the camera icon.

If the camera also captures sound, things get trickier: the level of intrusion is higher and only justifiable in very specific cases. And most importantly: recordings can’t be kept forever; after a reasonable period (in many cases, one month), they must be deleted.

ARCO Rights

The famous ARCO rights (Access, Rectification, Cancellation, and Opposition), explained as if we were having a coffee:

Access: It basically means you can ask, “What data of mine do you have?” And the company or administration must show you. It’s like checking the inventory of your own information.

Rectification: If you see something wrong (an old address, an error in your birth date, whatever), you can say: “Hey, fix this.” And it’s not a favor—it’s your right.

Cancellation: This is about deletion. If an entity no longer needs your data for the purpose it collected them, you can ask for them to be removed. But note: they can’t always delete them (for example, medical records or tax data must be kept by law).

Opposition: This is the “I don’t want you to use my data for this.” A typical example: a company has your data because you bought something, but you don’t want them to send you ads. You say “I object,” and they should stop.

Basically, ARCO rights are like a remote control for your personal data. You can view them, correct them, delete them (when appropriate), or stop certain uses. And the best part: you don’t need to give long explanations; you just ask, and the company or administration must respond within the deadline.
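The "remote control" metaphor maps neatly onto a request dispatcher. This is a toy sketch, not any authority's workflow, and the 30-day deadline is a placeholder: check the deadline in the applicable law.

```python
from datetime import date, timedelta

RESPONSE_DEADLINE = timedelta(days=30)  # placeholder; verify against the applicable law

def handle_arco(request_type: str, received_on: date) -> dict:
    """Route an ARCO request to its action and compute the response deadline."""
    actions = {
        "access": "send the data subject a copy of their data",
        "rectification": "correct the inaccurate fields",
        "cancellation": "delete the data, unless a legal duty requires keeping them",
        "opposition": "stop the contested processing (e.g. marketing)",
    }
    if request_type not in actions:
        raise ValueError(f"unknown ARCO request: {request_type}")
    return {"action": actions[request_type],
            "respond_by": received_on + RESPONSE_DEADLINE}

print(handle_arco("opposition", date(2024, 1, 1))["respond_by"])  # 2024-01-31
```

The useful part is the exception branch: a request that doesn't fit a known category should fail loudly, not silently fall into a drawer.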

Regulations

Regulations governing the Authority

  • Statute of Autonomy of Catalonia (arts. 4.1, 15, 20, 23, 27, 28, 30, 31, 76, 78, 156, 182.3) (DOGC no. 4680, 20.07.2006)
  • Law 32/2010, of October 1, on the Catalan Data Protection Authority (DOGC no. 5731, 08.10.2010)
  • Decree 48/2003, of February 20, approving the Statute of the Catalan Data Protection Agency (DOGC no. 3835, 04.03.2003)

State regulations

  • Spanish Constitution (arts. 10, 14, 16, 18, 20, 53, 105)
  • Organic Law 15/1999, of December 13, on Personal Data Protection (BOE no. 298, 14.12.1999)
  • Royal Decree 1720/2007, of December 21, on Personal Data Protection (BOE no. 17, 19.01.2008)

The Data Protection Advisory Council

Behind the Catalan Data Protection Authority there’s not only the team working day to day, but also a body that acts as a compass: the Advisory Council. Its mission is to provide insight, give independent opinions, and ensure that the Authority’s decisions aren’t made in a vacuum.

Some of its most relevant functions are:

  • Proposing to Parliament who should lead the Authority.
  • Reviewing and giving feedback on the instructions the APDCAT wants to approve.
  • Analyzing the annual budget before it moves forward.
  • Advising the director of the Authority on any issue submitted to them.
  • Reporting on the staff structure, both permanent and temporary.
  • And also producing studies and recommendations on how to improve data protection in Catalonia.

Plainly put: the Advisory Council doesn’t execute, but it sets the direction. It’s like that group of people with perspective and experience who put on the table what’s sometimes hard to see from the inside.

Our closing

Digital administration is rewriting the social contract. Every decision about data shapes the quality of our democracy. And like it or not, trust is more fragile than any IT system.

Speed matters, but fairness matters more. In the end, digital democracy won’t be judged by how fast its procedures are, but by how it respects the privacy of those on the other side of the screen.