Perplexity’s “Incognito Mode” is a “sham,” lawsuit says
Perplexity’s AI search engine encourages users to go deeper with their prompts by engaging in chat sessions that a lawsuit has alleged are often shared in their entirety with Google and Meta without users’ knowledge or consent.
“This happened to every user regardless of whether or not they signed up for a Perplexity account,” the lawsuit alleged, while stressing that “enormous volumes of sensitive information from both subscribed and non-subscribed users” are shared.
Tests using browser developer tools, described in the lawsuit, found that opening prompts are always shared, as are any suggested follow-up questions that a user clicks on. Privacy concerns are seemingly worse for non-subscribed users, the complaint alleged. Their initial prompts are shared with “a URL through which the entire conversation may be accessed by third parties like Meta and Google.”
Disturbingly, the lawsuit alleged, chats are also shared with personally identifiable information (PII), even when users who want to stay anonymous opt to use Perplexity’s “Incognito Mode.” That mode, the lawsuit charged, is a “sham.”
“‘Incognito’ mode does nothing to protect users from having their conversations shared with Meta and Google,” the complaint said. “Even paid users who turned on the ‘Incognito’ feature still had their conversations shared with Meta and Google, along with their email addresses and other identifiers that allowed Meta and Google to personally identify them.”
Financial, health info allegedly shared
The “extreme” privacy complaints arose in a proposed class action filed Tuesday by an anonymous Perplexity user, John Doe.
In his complaint, he likened ad trackers to “browser-based wiretap technology” that lets Google and Meta snoop on private Perplexity chat logs.
In violation of state and federal laws, he alleged, the AI firm never disclosed to users that it secretly uses tech giants’ ad trackers. His lawsuit targets all three companies, accusing them of putting profits over users’ privacy rights by seizing sensitive data that users did not realize would be shared.
Doe was “dismayed” to learn that complete and partial transcripts of chats discussing his family’s financial data were seemingly shared with Google and Meta, allegedly alongside PII. He relies on Perplexity to help manage his taxes, get legal advice, and make investment decisions, his complaint said. Without an injunction blocking Perplexity’s allegedly ongoing privacy harms, he will be unable to keep using his preferred search engine, the complaint said.
Other users in the proposed class most likely turned to Perplexity when researching other sensitive topics, the lawsuit alleged. According to the lawsuit, the companies designed ad trackers to operate “surreptitiously” so that they could “exploit this sensitive data for their own benefit, including targeting individuals with advertising and reselling their sensitive data to additional third parties.”
Perhaps most troublingly, people frequently use such AI systems to research health and medical information, particularly when consulting with a human might be embarrassing or upsetting.
Supposedly capitalizing on users’ tendency to overshare with AI systems, Perplexity is seemingly trained to request that users upload sensitive records during chat sessions, the complaint said. That includes information that, if shared with Google and Meta, could result in users suddenly being targeted with advertisements that they “may find overwhelming, disturbing, or, in many instances, physically deleterious,” the complaint said.
For example, Perplexity responds to a basic prompt like “What is the best treatment for liver cancer?” by volunteering that “I can help you interpret a specific scan report, biopsy result, or proposed treatment plan if you share more details,” the complaint noted.
Among the invasive trackers embedded in Perplexity’s AI search engine are the Meta Pixel, Google Ads, and Google DoubleClick, as well as possibly a technology that Meta calls the “Conversions API,” the lawsuit said. Meta allegedly recommends that partners use that last technology in combination with the Meta Pixel because it supposedly serves as a “workaround” that prevents “savvy users” from blocking Pixel tracking, the complaint said. Notably, Meta has been hit with several privacy lawsuits opposing that tech, with some settlements, while Congress has dinged some former partners who used trackers from Google and Meta.
Class action spans three years of chats
As AI tools like Perplexity’s have become ubiquitous, users have good reason to be concerned that their prompts and chats may be leaked. ChatGPT logs were recently shared with news organizations in a major copyright litigation and leaked in Google searches and analytics tools.
If users knew about Perplexity’s ad trackers sharing transcripts of chats, they wouldn’t use Perplexity’s search engine in the same way, Doe’s lawsuit alleged.
“No reasonable person would have expected that Perplexity would share complete transcripts of their conversations with Perplexity’s AI Machine with companies like Meta and Google,” the complaint said. “But that is what Perplexity did.”
The proposed class covers certain Perplexity users nationwide whose chats were allegedly shared with Google and Meta between December 7, 2022, and February 4, 2026. There is also a separate subclass for California users pursuing additional claims. Neither the class nor the subclass covers paid “Perplexity Pro” and “Perplexity Max” subscribers, because Doe never accessed those tiers of service and cannot adequately represent those users’ interests, the lawsuit noted.
Google, Meta, and Perplexity could face substantial fines in a loss, with perhaps millions of chat logs involved and potential statutory damages that could exceed $5,000 per violation. Doe also seeks punitive fines as well as disgorgement for any unjust enrichment, as the companies allegedly used the sensitive information to improve their products, their own marketing, and their targeted advertising.
Perplexity hides privacy policy
In addition to allegedly violating laws, the companies are accused of breaching their own privacy policies and terms of use by collecting and sharing sensitive data.
Specifically, Google and Meta are accused of failing to enforce policies prohibiting the disclosure of confidential or sensitive information through the use of their trackers. Those policies only exist to create “plausible deniability” to help the tech giants dodge lawsuits, the complaint alleged.
The complaint noted that Perplexity never asks users to agree to its privacy policy, and there is no link to the privacy policy on the search engine’s homepage. That’s different from popular search engines like Google (where the privacy policy is in the footer) and Bing (which links its policy in a menu).
Perplexity users would have to rely on a search engine to find the policy, the complaint said, and even then, there would seemingly be no way to detect the invasive ad tracking. In the section of its privacy policy that discusses tracking, Perplexity does not mention specific trackers. Instead, it warns users that it does not honor “Do Not Track” signals and advises that attempts to block trackers could affect services.
“Perplexity’s failure to inform its users that their personal information has been disclosed to Meta and Google or to take any steps to halt the continued disclosure of users’ information is malicious, oppressive, and in reckless disregard” of users’ rights, the lawsuit alleged.
Perplexity’s privacy policy does emphasize that the company does not “‘sell’ or ‘share’ sensitive personal information for cross-context behavioral advertising.”
Ars could not immediately reach Perplexity or Meta for comment.
Google’s spokesperson provided Ars with a statement that suggested it wasn’t responsible for Perplexity’s alleged failure to properly disclose how chats could be shared.
“Businesses manage the data they collect and are responsible for informing users about it,” Google’s spokesperson said. “By default, data sent to Google Analytics for measurement does not identify individuals, we have strict policies against advertising based on sensitive information, and we don’t sell personal information.”
Doe is hoping a jury will find that Perplexity’s ad trackers are unlawful and order injunctive relief, restitution, and disgorgement, as well as a range of damages. Without an injunction, the only remedy Perplexity users will have is allegedly paying costly services to prevent disclosures of “their most sensitive and personal information,” his lawsuit claimed.
“Nothing on Perplexity’s website warns users that their conversations with its AI Machine will be shared with Meta and Google,” Doe alleged. “Much less does Perplexity warn subscribed users that its ‘Incognito Mode’ does not function to protect users’ private conversations from disclosure to companies like Meta and Google.”
Ashley is a senior policy reporter for Ars Technica, dedicated to tracking social impacts of emerging policies and new technologies. She is a Chicago-based journalist with 20 years of experience.