Behavioural advertising is out of control, warns UK watchdog
The online behavioural advertising industry is illegally profiling internet users.
That’s the damning assessment of the U.K.’s data protection regulator in an update report published today, in which it sets out major concerns about the programmatic advertising process known as real-time bidding (RTB), which makes up a large chunk of online advertising.
In what sounds like a knock-out blow for highly invasive data-driven ads, the Information Commissioner’s Office (ICO) concludes that systematic profiling of web users via invasive tracking technologies such as cookies is in breach of U.K. and pan-EU privacy laws.
“The adtech industry appears immature in its understanding of data protection requirements,” it writes. “Whilst the automated delivery of ad impressions is here to stay, we have general, systemic concerns around the level of compliance of RTB.”
As we’ve previously reported, multiple complaints have been filed with European regulators, including the ICO, arguing that RTB is in breach of the pan-EU General Data Protection Regulation (GDPR).
The U.K. watchdog has not yet issued a formal legal decision against RTB. But with this report it’s giving the industry a clear signal that practices must change.
Its full list of conclusions is well worth reading — so we’ve pasted it below, along with our own “plainer English” paraphrasing of what’s actually being said (formatted in italics):
1. Processing of non-special category data is taking place unlawfully at the point of collection due to the perception that legitimate interests can be used for placing and/or reading a cookie or other technology (rather than obtaining the consent PECR [Privacy and Electronic Communications Regulations] requires).
The ICO has found that consents for dropping trackers like cookies are not being legally obtained. The law requires consent to be obtained before a tracker is dropped and/or read from. This means internet users must be asked for consent before tracking starts happening, and also, at the point they are asked, be provided with “clear and comprehensive information” about what’s intended, so that they can make a free and informed choice about whether they want to consent or not. Whereas what’s happening now is web users are being tracked without being asked if that’s okay, and without the extent and implications of all this mass surveillance being made plain to them.
2. Any processing of special category data is taking place unlawfully as explicit consent is not being collected (and no other condition applies). In general, processing such data requires more protection as it brings an increased potential for harm to individuals.
Sensitive personal data (such as political views, health information, sexual orientation) is being processed by the behavioural advertising industry — but not legally because, under U.K. and EU law, handling this sort of information requires a higher standard of explicit consent, given there are much greater risks of harms were it to be misused or go astray. The problem is the adtech industry is not asking internet users for explicit consent to make and share these sensitive inferences — likely because if a pop-up asked you to agree to, for example, your political or sexual preferences being broadcast to hundreds of advertisers you’d be sure to click ‘hell no.’ Trying to get around the law by just not asking also isn’t legal.
3. Even if an argument could be made for reliance on legitimate interests, participants within the ecosystem are unable to demonstrate that they have properly carried out the legitimate interests tests and implemented appropriate safeguards.
Here the ICO is doubly crushing the industry’s bogus reliance on claiming what’s known as ‘legitimate interest’ as the legal basis for violating internet users’ personal space and intimacy by spying on them. Even if it were possible to use this basis for this data purpose, the watchdog points out they haven’t even fulfilled the standard for LI — which requires carrying out various assessments and taking steps to secure people’s data. What’s actually happening is RTB does the equivalent of blasting everything it knows about you through a giant global megaphone. So, er, not at all safe then.
4. There appears to be a lack of understanding of, and potentially compliance with, the DPIA requirements of data protection law more broadly (and specifically as regards the ICO’s Article 35(4) list). We therefore have little confidence that the risks associated with RTB have been fully assessed and mitigated.
The ICO says it believes the adtech industry has also failed to do due diligence on RTB — because it’s found companies haven’t even bothered to carry out data protection impact assessments (DPIAs). That, in turn, suggests they haven’t even tried to get a handle on privacy risks, and therefore are demonstrably not making any effort to try to reduce those risks. Epic fail.
5. Privacy information provided to individuals lacks clarity whilst also being overly complex. The TCF and Authorized Buyers frameworks are insufficient to ensure transparency and fair processing of the personal data in question and therefore also insufficient to provide for free and informed consent, with attendant implications for PECR compliance.
What’s being said here is that privacy policies and consent pop-ups are horribly confusing — which means internet users have little hope of understanding what on earth they’re being asked to agree to. Yet for consent to be legal, people need to understand that. The ICO also specifically calls out industry mechanisms created by the Internet Advertising Bureau and Google for publishers and advertisers to gather consents as falling short of the legal standard. So, again, another major, major fail.
6. The profiles created about individuals are extremely detailed and are repeatedly shared among hundreds of organisations for any one bid request, all without the individuals’ knowledge.
If you thought internet ads were creepy, here’s the proof: The ICO is saying the behavioural advertising industry’s mass surveillance of web users results in all of us being profiled in crazy detail — and those spy files then being routinely handed off to (at least) hundreds of companies who are involved in the adtech chain every time there’s a programmatic ad transaction. These Stasi-esque dossiers are also being handed over, no strings attached, billions of times per day — so goodness knows where they end up. Still browsing comfortably?
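To make concrete what gets shared in one of these bid requests, here is a simplified sketch loosely modelled on the OpenRTB 2.x schema (the field names follow the spec’s conventions, but every value, the segment labels, and the overall shape are invented for illustration — real requests carry far more):

```python
import json

# Simplified, illustrative sketch of an OpenRTB-style bid request, the
# payload broadcast to bidders during a programmatic ad auction.
# Field names follow OpenRTB 2.x conventions; all values are invented.
bid_request = {
    "id": "auction-8f3a",                    # unique ID for this auction
    "imp": [{"id": "1", "banner": {"w": 300, "h": 250}}],  # the ad slot
    "site": {"page": "https://example.com/article"},  # what you're reading
    "device": {
        "ua": "Mozilla/5.0 (example)",       # browser/device details
        "ip": "203.0.113.7",                 # implies approximate location
        "geo": {"lat": 51.5, "lon": -0.1},
    },
    "user": {
        "id": "cookie-synced-uid-123",       # pseudonymous ID tied to you
        "data": [{
            "segment": [                     # inferred interest profile
                {"name": "interest", "value": "mortgages"},
                {"name": "interest", "value": "health"},
            ],
        }],
    },
}

# A payload like this is sent to every bidder in the auction -- potentially
# hundreds of companies -- before a single ad is shown.
print(json.dumps(bid_request, indent=2))
```

The ICO’s point is that everything in the `user` and `device` objects above is personal data, and it leaves the publisher’s control the instant the auction starts.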
7. Thousands of organisations are processing billions of bid requests in the UK each week with (at best) inconsistent application of adequate technical and organisational measures to secure the data in transit and at rest, and with little or no consideration as to the requirements of data protection law about international transfers of personal data.
Here the watchdog makes it clear that it agrees with the substance of the RTB complaints — i.e. that people’s information is not being lawfully handled because it’s not being properly protected. It also essentially makes the point that these illegal spy files could end up in Timbuktu and you’d be none the wiser.
8. There are similar inconsistencies about the application of data minimisation and retention controls.
If all that wasn’t enough, the ICO is saying the adtech industry is failing on other core legal requirements to collect as little data as possible and to place strict limits on how long it keeps data. Insert your own *unsurprised face.*
9. Individuals have no guarantees about the security of their personal data within the ecosystem.
If it wasn’t already really obvious, the watchdog rams the point home: Basically, behavioural advertising is out of control.
“The processing operations involved in RTB are of a nature likely to result in a high risk to the rights and freedoms of individuals,” it further warns.
The complexity and opacity involved in data-driven advertising also means internet users are hopelessly outgunned as their rights are systematically steamrollered. (Or as the ICO puts it: “The complex nature of the ecosystem means that in our view participants are engaging with it without fully understanding the privacy and ethical issues involved.”)
While you might think such a long laundry list of staggeringly massive rights violations should be more than enough for any watchdog to bring down the hammer and order the illegal practices to cease, the ICO is taking a different tack.
It’s creeping ahead cautiously — saying it wants to gather more data from the industry, perhaps issue another report next year, while also signaling to adtech companies that practices must change.
This is frustratingly contradictory — because the ICO also writes that it doesn’t believe the industry will change without a regulatory smackdown.
“Our work has highlighted the lack of maturity of some market participants, and the ongoing commercial incentives to associate personal data with bid requests. We do not think these issues will be addressed without intervention. We are therefore planning a measured and iterative approach, so that we act decisively and transparently, but also in ways in which we can observe the market’s reaction and adapt our approach accordingly,” it says in the report.
“We intend to provide market participants with an appropriate period of time to adjust their practices. After this period, we expect data controllers and market participants to have addressed our concerns.”
The contrast between the view it’s now putting out there, that massive violations of laws and rights are occurring, and yet more regulatory inaction means the ICO is coming in for some major flak from data protection and privacy experts, who make the salient point that rules don’t exist unless they’re enforced. Nor indeed do rights, unless they’re defended and upheld…
However, we need action. The next steps in this report need to be much more firm. AdTech is illegal in its current form: letting it continue undermines the GDPR in all sectors. pic.twitter.com/Ns9AQCB7bo
— Michael Veale (@mikarv) June 20, 2019
If the way how data-driven online marketing currently works is illegal at scale, then it needs to be stopped from happening. Now. Each day EU data protection authorities let it continue to happen this:
– further violates people’s rights and freedoms
– totally undermines the GDPR
— Wolfie Christl (@WolfieChristl) June 20, 2019
Reached for comment on the ICO’s report, Dr Johnny Ryan, chief policy and industry relations officer of private browser Brave — and also one of the individuals behind the original RTB complaints — told us: “The ICO’s report recognises the data protection issues that we raised back in September last year. This is a useful confirmation of what was already clear. However, there is an urgent need for action now to prevent the identified illegality that undermines the privacy and data protection of every person using the internet; the regulator must now take action.”
We’ve reached out to the IAB and Google for comment, but at the time of writing neither had sent a response to the report.
The ICO’s earlier Technology Strategy planning document highlighted the risks posed by data-driven advertising. It followed that by making interrogating adtech practices a regulatory priority — hence today’s update.
Privacy and rights campaigners have also concentrated attention on the sector since GDPR came into force, by filing complaints challenging the legality of behavioural advertising.
In May the Irish DPC announced it had opened a formal investigation into Google’s adtech, after an initial assessment of an RTB complaint filed in Ireland.
It’s likely the ICO is now holding off to await the outcome of the DPC’s formal probe.
In its report the U.K. regulator does say it will “continue to liaise and share information with our European colleagues” — and also commits to “identify opportunities to work together where appropriate.” So there is likely co-ordination going on between the two DPAs.
There is also a hint of a solution in the report, when the ICO says it will “further consult with IAB Europe and Google about the detailed schema they are utilising in their respective frameworks to identify whether specific data fields are excessive and intrusive, and possibly agree (or mandate) revised schema.”
This sounds like it’s coming round to the view that online advertising doesn’t need masses of personal data to function — but can in fact be targeted contextually, delivering ad clicks while simultaneously protecting individuals’ privacy and fundamental rights.
A view that some online publishers also share. (Also relevant: revenues generated by the current structure of the adtech market flow disproportionately to the tech giant duopoly of Facebook and Google, whereas publisher revenues have not enjoyed massive growth…)
“We understand that advertisements fund much of what we enjoy online. We understand the need for a system that allows revenue for publishers and audiences for advertisers. We understand a need for the process to happen in a heartbeat. Our aim is to prompt changes that reflect this reality, but also to ensure respect for internet users’ legal rights,” writes information commissioner Elizabeth Denham.
“The rules that protect people’s personal data must be followed. Companies do not need to choose between innovation and privacy.”
Not just that, there’ll be innovation in contextual advertising through NLP and website analysis that is yet untapped. Expect that -4% to turn into a +??%. https://t.co/FLCizX2I66
— Michael Veale (@mikarv) June 20, 2019
(For context on the -4% figure cited in the above tweet see here.)
Update: Townsend Feehan, CEO of the IAB Europe, has now sent the following statement responding to the ICO’s assessment of mass scale non-compliance with data protection rules:
IAB Europe welcomes yesterday’s ‘Adtech Update Report’ issued by the UK Information Commissioner’s Office (ICO). We appreciate the ICO’s measured approach and focus on understanding the practices of, and engagement with, the advertising industry as expressed in the report. We look forward to working with the ICO over the coming weeks and months to continue to educate the ICO on the industry’s practices, identify and address its concerns, and drive the industry in a positive direction toward a standardised solution.
The ability to address the ICO’s concerns is near impossible to achieve without a standardised industry solution and we share the ICO’s aim that parties operating within digital advertising can continue to operate responsibly and in compliance with relevant laws, to ensure the sustainability of this innovative sector which underpins the ad-funded internet.
We also welcome the opportunity to clarify some of the misconceptions in the report’s description of the features and functionality of the Transparency & Consent Framework (TCF). The TCF provides a common framework to facilitate compliance with certain of the requirements of the ePrivacy Directive and the GDPR for every part of the advertising value chain, from publishers and technology companies through to agencies and advertisers. In addition, the TCF ensures publishers and advertisers can provide users transparency and choice about the processing of their personal data while continuing to maintain choice in the technology companies with whom they wish to work.
The Content Taxonomy provides nomenclature for categorizing content. It can be applied by publishers and other companies in conjunction with OpenRTB – a communication protocol supporting real-time bidding – and other technologies to allow for better placement of advertising alongside editorial, notably including avoidance of ads for content falling into sensitive categories. Companies choosing to implement the OpenRTB protocol and Content Taxonomy are responsible for ensuring that any personal data they pass or receive complies with the privacy laws and restrictions of their jurisdiction. This is similar to a company’s use of any similar technology, such as HTTP or Wi-Fi.
The IAB Europe Policy team and I will be working closely with the ICO – as we have with other regional Data Protection Authorities (DPAs) – and this ongoing dialogue will inform any future iterations of the TCF, to strengthen its ability to support the industry in mitigating privacy-related risks, so online users have confidence and trust in how their data is being used.