Third-party pixels in ChatGPT, Claude, Grok, and Perplexity: Google Analytics collects the title and URL of your conversations

An IMDEA Networks study detects third-party trackers in ChatGPT, Claude, Grok, and Perplexity. It warns that these mechanisms can transfer metadata and sensitive data to external servers.

7 May 2026, 11:03

A study by the IMDEA Networks Institute reveals that ChatGPT, Claude, Grok, and Perplexity incorporate third-party trackers that compromise user privacy. The research indicates that these artificial intelligence tools allow companies like Google or Meta to access sensitive conversation data.

Trackers extract metadata from queries

The technical analysis detected the systematic presence of tracking pixels in the free tiers of these services. When users interact with ChatGPT without additional protection, Google Analytics collects the conversation's title and URL, transferring contextual information to external servers unrelated to the primary provider.
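To illustrate the kind of transfer described above, the sketch below builds a Google Analytics Measurement Protocol-style beacon URL. The `dl` (document location) and `dt` (document title) parameters are real Measurement Protocol fields; the tracking ID, domain, and conversation title here are hypothetical placeholders, not values observed in the study.

```python
from urllib.parse import urlencode

def build_beacon_url(page_url: str, page_title: str) -> str:
    """Build a Measurement Protocol-style collect URL carrying
    the page location (dl) and document title (dt)."""
    params = {
        "v": "1",             # protocol version
        "tid": "UA-XXXXX-Y",  # placeholder tracking ID (hypothetical)
        "t": "pageview",      # hit type
        "dl": page_url,       # document location: the conversation URL
        "dt": page_title,     # document title: the conversation title
    }
    return "https://www.google-analytics.com/collect?" + urlencode(params)

# A hypothetical chat: the title alone reveals the topic of the conversation.
url = build_beacon_url(
    "https://chat.example.com/c/abc123",
    "Tips for coping with depression",
)
print(url)
```

Even without the conversation body, a request like this hands the analytics endpoint a title that often summarizes what the user asked about.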

The pattern repeats on the sector's other leading platforms. Meta and TikTok pixels operating in Grok collect web addresses and chat titles, while in Perplexity and Claude these mechanisms share accepted cookies, conversation URLs, and technical metadata with the social media companies.

Perplexity presents an added vulnerability for users who are not logged in: their conversations remain publicly accessible, exposing the content of queries to anyone who follows the link or searches for snippets of the indexed text.

Researchers warn about sensitive data inference

Narseo Vallina and Jorge García Herrero, researchers affiliated with IMDEA Networks and the Universidad Carlos III de Madrid, underscore the seriousness of this exposure. Chats with virtual assistants often contain intimate information because users perceive these tools as spaces of absolute trust.

"Depression tips, aggressive tax deductions, I've been cheated on, lose even more weight, a strange cough... You don't need to be Sherlock Holmes to infer the content of the conversation" - Jorge García Herrero, researcher at IMDEA Networks Institute

The lawyer specifies that the study proves that third parties such as Google or Meta have potential access, but does not demonstrate that they actually use it for commercial profiling. Nevertheless, he considers that admitting parties external to the current data processing constitutes a serious infringement by artificial intelligence providers.

About the author
Redacción