FTC's Kochava Ban Closes One Door on the Data-Broker Auction House
The most aggressive federal action against a data broker yet still leaves vast real-time bidding systems unscathed, underscoring how the surveillance economy remains largely intact.
On May 5, 2026, the Federal Trade Commission announced a proposed settlement that will ban the data broker Kochava and its subsidiary Collective Data Solutions from selling, licensing, transferring, sharing, or disclosing sensitive location data without express consumer consent. The settlement, which resolves a lawsuit the FTC filed years earlier, covers precise geolocation information harvested from hundreds of millions of mobile devices, data detailed enough to trace a person to a doctor's office, a place of worship, or a domestic violence shelter.
The Kochava order is the most muscular enforcement action the United States has ever taken against a standalone data broker. But it is also, as Cobun Zweifel-Keegan, managing director at the International Association of Privacy Professionals, put it in an analysis published days after the settlement, a reminder that "Kochava is not enough." The company is one node in a sprawling data-broker ecosystem that operates largely through an auction layer most consumers have never heard of, where their location history, browsing patterns, and demographic profiles are priced and sold in milliseconds.
Understanding the Kochava case requires understanding the pipeline. App analytics firms embed software development kits into thousands of mobile applications. Those kits collect geolocation coordinates, device identifiers, IP addresses, and usage patterns. The data flows to aggregators like Kochava, which package it into audience segments ("new moms in Memphis," "truck drivers crossing state lines," "people who visited a cardiologist twice this quarter") and list those segments on programmatic advertising exchanges. Buyers bid on access through automated auctions that complete before the web page or app screen finishes loading. The FTC's complaint alleged that Kochava's data marketplace allowed purchasers to identify specific individuals and track them to sensitive locations, including reproductive health clinics.
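The aggregation step in that pipeline can be sketched in a few lines of code. Everything here is illustrative: the field names, the point-of-interest matching, and the bounding-box logic are assumptions for the sake of the sketch, not Kochava's actual schema or methodology.

```python
from dataclasses import dataclass

@dataclass
class DeviceEvent:
    """One record as an analytics SDK might emit it (hypothetical fields)."""
    device_id: str   # persistent advertising identifier
    lat: float
    lon: float
    timestamp: int   # Unix epoch seconds
    app_bundle: str

def segment_events(events, poi_lat, poi_lon, radius_deg=0.001):
    """Collect device IDs whose coordinates fall near a point of interest.

    A real aggregator would use geodesic distance and dwell-time logic;
    this toy version uses a simple bounding box around the coordinates.
    """
    matched = set()
    for e in events:
        if abs(e.lat - poi_lat) <= radius_deg and abs(e.lon - poi_lon) <= radius_deg:
            matched.add(e.device_id)
    return matched

# Two devices; one pings near a hypothetical sensitive location.
events = [
    DeviceEvent("idfa-aaa", 43.6150, -116.2023, 1714900000, "com.example.weather"),
    DeviceEvent("idfa-bbb", 43.9000, -116.5000, 1714900060, "com.example.game"),
]
audience = segment_events(events, poi_lat=43.6151, poi_lon=-116.2024)
print(audience)  # {'idfa-aaa'}
```

The point of the sketch is how little machinery is required: once per-device coordinates exist in one place, turning them into a sellable "visited this location" segment is a filter, not an engineering feat.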
The settlement prohibits Kochava from "selling, licensing, transferring, sharing, or disclosing" sensitive location data without affirmative express consent, as The Verge reported when the order was announced. The language is precise partly because the industry's business model depends on linguistic imprecision. For years, data brokers argued that consumers "consented" to location collection by accepting sprawling app permissions or clicking through terms of service that no ordinary person reads. The FTC's order rejects that framing entirely.
"The ban settles the FTC's lawsuit alleging that Kochava sold sensitive geolocation details that could track people seeking or performing abortions." (The Verge, reporting the FTC settlement terms, May 5, 2026)
What the Kochava order does not address is the auction mechanism itself. The real-time bidding protocols that power digital advertising, the same infrastructure that lets a shoe brand target people who visited a competitor's website, are the plumbing through which sensitive location data flows. Every bid request sent to an ad exchange carries a payload of information about the user being auctioned: approximate location, device type, browsing context, and often a persistent identifier that can be linked across sessions, apps, and devices. The number of companies that receive this payload during a single auction is unknowable to the person being auctioned. The FTC's settlement with Kochava shuts down one seller. The auction house remains open.
The BleepingComputer report on the settlement noted that Kochava collected precise geolocation data from hundreds of millions of mobile devices. That scale is not unique to Kochava. Several firms operate comparable location-data marketplaces, and none of them are covered by this settlement. The data-broker ecosystem is not a handful of bad actors; it is an industry built on the premise that behavioral surveillance can be made consensual through notice-and-click regimes that everyone involved knows are performative.
The government is not merely a regulator of this market. It is a buyer. An NPR investigation published in March 2026 documented how Immigration and Customs Enforcement and other federal agencies purchase commercial data about Americans in bulk, sidestepping the warrant requirement that would apply if the same information were obtained directly from a telecom carrier. The data brokers who sell to advertisers are often the same ones who sell to law enforcement. The Kochava settlement does nothing to restrict that government purchasing pipeline.
Meanwhile, state legislatures are moving. On May 4, 2026, the Connecticut House gave final passage to SB 4, legislation that specifically regulates data brokers, defined in the bill as a business that sells or licenses brokered personal data. Connecticut's law joins a patchwork of state-level data-broker registries and deletion requirements that have emerged since California passed the first such law in 2018. These laws create modest transparency obligations (data brokers must register with the state and provide some mechanism for consumers to request deletion), but they do not prohibit the core transactions that sustain the industry.
For individuals trying to grasp the scope of what is collected about them, the picture is disorienting. A USA Today guide published in March 2026 described the experience of Googling your own name and discovering an address, phone number, or family link you never remember posting. The article recommended data-removal services like Incogni as a way to reduce exposure, but the subtext of that recommendation is the absence of any systemic fix. Consumers are being asked to opt out individually from an ecosystem that adds them back in the moment they install a new app, visit a new website, or walk past a Bluetooth beacon.
The consent architecture behind all of this is worth examining closely. Under the GDPR, the European Union's privacy regulation, companies must obtain freely given, specific, informed, and unambiguous consent before processing personal data. The standard is high and, in theory, enforced. In the United States, there is no equivalent federal law. The FTC has relied on its Section 5 authority to police unfair and deceptive practices, which is how it pursued Kochava. But unfairness and deception are after-the-fact standards applied through litigation. They do not require companies to design systems that protect privacy by default.
What the auction layer actually is and why it matters
The term "auction layer" refers to the programmatic advertising infrastructure that matches ad buyers with ad inventory in real time. When a person opens an app or loads a webpage, a bid request fires to an ad exchange. That request contains information about the available ad slot and, crucially, about the person whose attention is being sold. The exchange runs an auction among multiple potential buyers; the auction completes in under 100 milliseconds, and the winning ad is served. Every entity that participates in the auction receives the bid request and its accompanying data. There is no practical limit on how many companies can participate, and there is no meaningful way for the person being auctioned to know who bid, who saw the data, or what they did with it afterward.
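The structural point, that every bidder receives the payload whether or not it wins, can be made concrete with a toy auction. The bid-request shape below is loosely modeled on the industry's OpenRTB format, but the fields are simplified and the bidder names and prices are invented for illustration.

```python
import json

# A simplified bid request, loosely modeled on the OpenRTB shape.
# Field names are illustrative, not a complete or authoritative schema.
bid_request = {
    "id": "auction-123",
    "device": {
        "ifa": "idfa-aaa",  # persistent advertising identifier
        "geo": {"lat": 43.6150, "lon": -116.2023},
    },
    "app": {"bundle": "com.example.weather"},
    "imp": [{"id": "1", "banner": {"w": 320, "h": 50}}],
}

def run_auction(request, bidders):
    """Broadcast the request to every bidder and return the highest bid.

    The key structural point: every bidder receives the full payload,
    including geolocation and the device identifier, whether or not
    it wins the auction. Nothing in the protocol stops a losing bidder
    from retaining what it saw.
    """
    payload = json.dumps(request)  # each participant gets a full copy
    bids = [(bidder(payload), name) for name, bidder in bidders.items()]
    price, winner = max(bids)
    return winner, price

bidders = {
    "dsp_shoes":  lambda payload: 1.20,  # toy CPM prices
    "dsp_travel": lambda payload: 0.85,
    "dsp_broker": lambda payload: 0.10,  # low bid, but still sees the data
}
winner, price = run_auction(bid_request, bidders)
print(winner, price)  # dsp_shoes 1.2
```

Only one bidder pays, but all three received the device identifier and coordinates. That asymmetry, data out to everyone, money in from one, is why a consent order against a single data seller leaves the broadcast mechanism itself untouched.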
Location data is particularly dangerous in this system because it is effectively impossible to anonymize. A 2013 study published in Scientific Reports demonstrated that four spatio-temporal points, roughly the locations of a person at four specific times, are enough to uniquely identify 95 percent of individuals in a dataset of 1.5 million people. The mobile ad ecosystem generates hundreds of such points per person per day. When Kochava sold "precise geolocation data" from hundreds of millions of devices, it was selling information that could, with minimal computational effort, be linked back to specific human beings.
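The unicity result is easy to reproduce in miniature. The simulation below is a simplified illustration of the methodology in the 2013 study, run on synthetic traces rather than the study's mobile-phone dataset; the grid size, trace length, and user count are arbitrary assumptions.

```python
import random

def unicity(n_users=1000, points_per_user=50, n_cells=500, n_slots=48, k=4, seed=1):
    """Toy estimate of trace unicity: the fraction of users whose trace is
    the only one containing k randomly chosen (place, time) points of theirs.

    Simplified illustration of de Montjoye et al. (2013), on synthetic data.
    """
    rng = random.Random(seed)
    # Each trace is a set of (grid cell, time slot) observations.
    traces = [
        {(rng.randrange(n_cells), rng.randrange(n_slots))
         for _ in range(points_per_user)}
        for _ in range(n_users)
    ]
    unique = 0
    for trace in traces:
        sample = set(rng.sample(sorted(trace), k))  # k known points
        matches = sum(1 for t in traces if sample <= t)
        if matches == 1:  # only one trace contains all k points
            unique += 1
    return unique / n_users

print(f"{unicity():.0%} of toy users uniquely identified by 4 points")
```

Even with only 500 locations and 48 time slots, almost every synthetic user is pinned down by four points, because the space of possible (place, time) combinations dwarfs any individual trace. Real traces are less random than this toy model, but the study found the same effect at scale.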
The settlement requires Kochava to obtain "affirmative express consent" before handling sensitive location data. The question that privacy researchers and civil-liberties advocates are now asking is whether any consent mechanism built atop the programmatic auction system can ever be meaningful. The European Court of Justice ruled in 2024 that IAB Europe's Transparency and Consent Framework, the industry's primary consent mechanism for real-time bidding, violated the GDPR because it did not provide users with sufficient information about how their data would be processed. If the consent infrastructure itself is legally defective, adding a consent checkbox to Kochava's data pipeline does not solve the structural problem.
The IAPP's Zweifel-Keegan framed the settlement as a milestone that nonetheless highlights the limits of case-by-case enforcement. The FTC's litigation against Kochava took years. During those years, the company continued operating, and the market for location data continued expanding. A single consent order against a single broker, however stringent, cannot reconfigure an industry whose economic logic depends on mass surveillance as the default setting of digital life.
What federal legislation might restructure that default remains uncertain. The American Privacy Rights Act, the most comprehensive federal privacy bill introduced in recent years, has stalled repeatedly in Congress. Without a federal statute that establishes clear data-minimization requirements, limits on secondary use, and a private right of action, the FTC will remain in the position of playing whack-a-mole with individual data brokers while the auction infrastructure beneath them hums along unchanged.
What to watch for
The Kochava consent order includes compliance monitoring and reporting requirements that will produce public filings over the coming years. Those filings will offer a rare window into whether a data broker subject to a consent decree can actually operationalize meaningful consent, or whether the structure of the auction market makes compliance technically impossible. Privacy researchers and journalists should watch the FTC's enforcement docket for those reports.
Simultaneously, state attorneys general have begun using their own consumer protection authority to target data brokers. California, Texas, and New York have opened investigations into location-data practices that mirror the FTC's theory in Kochava. The Connecticut legislation creates a registration requirement that will, for the first time, generate a public inventory of data brokers operating in the state. When that registry goes live, anyone will be able to look up the companies that trade in their information. The data-broker ecosystem has operated in the dark for two decades. A growing number of regulators, and a growing number of readers, are turning on the lights.