Updated May 2020.
Last month I worked with Coadec, the policy voice of Britain’s tech startups, to lead on the formulation of their response to the ICO’s draft Age Appropriate Design Code. I wrote this draft document as a roadmap for the work, although what follows here are very much my own personal thoughts, not Coadec’s.
The draft Age Appropriate Design Code puts the startup ecosystem in the UK at risk. It will mean age-gating across the internet, huge amounts of data collection, & create an internet for kids designed by tech giants. Read our letter to @ICOnews here: https://t.co/G6HMmuQTHU pic.twitter.com/LFZ3TmKw2d
— Coadec (@Coadec) May 31, 2019
I would like to explain – as a policy wonk, a technologist, a privacy campaigner, and as a parent – why the draft Code is one of the worst proposals on internet legislation I’ve ever seen.
What is the Age Appropriate Design Code?
The ICO, the UK’s data protection regulator, has published a draft Age Appropriate Design Code for public consultation. This has been done under an amendment to the Data Protection Act 2018 requiring it to “prepare a code of practice which contains such guidance as the Commissioner considers appropriate on standards of age-appropriate design of relevant information society services which are likely to be accessed by children.”
The ICO is presenting the draft Code’s requirements as a GDPR matter, a task mandated by GDPR, or a requirement for GDPR compliance. It is none of those things. The draft Code stems from an amendment to a piece of domestic legislation (Amendment 123); it was not requested, mandated, or required by Europe in any way, shape, or form. Yet because the Data Protection Act is the domestic implementation of a European law, the requirements presented in the draft Code will substantially shift and amend the Data Protection Act, moving the UK completely out of step with Europe.
This will create the ironic situation where the ICO will be responsible for leading the UK’s regulatory divergence away from GDPR after Brexit.
Who must comply with the Code?
The draft Code would be applicable to all under-18s in the UK, regardless of whether the site or service they are accessing is based within the UK, and would apply to any information society service likely to be accessed by children. This means any site, app, service, or product which could conceivably be accessed by a British child at any time, even if it is not targeted at children.
The draft Code states that “if your service is likely to be accessed by children but you don’t know which users are children, you must apply the code to all users.” So you – yes, you – have a simple choice: overhaul everything from the ground up in case children might access it, age gate every user to identify the children, or find out how many children access the service by collecting data on all users.
This is a staggering alteration to the fundamental right to access the internet – and to the business models of every company doing business online in the UK. After all, the only way to determine whether an information society service is being accessed by young people is by collecting data about them.

This requires every business with an online presence which could conceivably be accessed by an under-18 at any point in its lifecycle to engage in personally identifiable corporate surveillance of all users.
Extraterritorial applicability
Non-UK businesses targeting UK users will be required to comply with the Code. Of course, the US tech giants whose abuses have largely inspired this Code will find the compliance requirements – and costs – easy to bear. For other non-UK businesses, many of whom are still reeling from the compliance process for GDPR while bracing for the incoming US Federal privacy legislation, there is only one logical option: they will block UK users altogether rather than be compelled to invest time and money into a nigh-impossible compliance process which, appallingly, equates their noncompliance with child exploitation.
At a time when the UK tech sector is contending with no fewer than eight ongoing regulatory processes, a run of Ministers for Digital who openly despise the sector, and the catastrophic impact of Brexit, the draft Code has been written in a way that sends the most hostile message possible to the tech sector both here and abroad. If the goal of the draft Code is to trigger an exodus of tech businesses and investment, it will succeed.
When is a standard not a standard?
The draft Code, despite discussing “code standards”, provides no actual standards – no technical guidance – on how its provisions should be achieved. The actual drafting and interpretation of the Code’s provisions will be outsourced to you – yes, you – to solve on your own time and at your own expense.
If the UK truly wants to lead the way in online child protection, it should do so on a global scale through open standards, not on a national level through closed coregulation. The Digital Charter’s vision of Britain making the internet safer would be better achieved by collaborating with working groups and standards bodies, such as the W3C, to draft actual open technical standards, similar to the WCAG standards for people with disabilities.
As it stands now, the draft Code is a blueprint for a repeat of the cookie law fiasco, mandating you to deploy poorly designed, hostile, and counterproductive verification processes which will do nothing to serve the purpose for which they were intended or protect the people they were meant to help.
How much is this going to cost you?
Crucially, the draft Code makes no mention of the costs which businesses can expect to incur to come into compliance.
That goes further than an administrative oversight. No economic impact assessment has been carried out whatsoever.
If the ICO were inclined to try to claim that the Code was covered under the GDPR or Data Protection Bill impact assessments, they would be wrong. The initial government impact assessment on the Data Protection Bill says nothing about the AADC. (Well of course it wouldn’t – the AADC was a Lords amendment added halfway through the legislative process.) The wider government general impact assessment says nothing about the AADC either. All discussions of children there related to the transposition of GDPR Article 8 (children’s consent) and the definition of that age of consent, even in an age verification context. The sixteen provisions of the AADC, which are a completely separate matter, are nowhere.
And, of course, there would be no mention of the AADC in the EU’s GDPR impact assessment, as the AADC is a wholly domestic idea not mandated or regulated by Europe in any way.
Those unknown compliance costs to UK and overseas businesses, which will be substantial, will also be a means of outsourcing the costs of the ICO becoming the UK’s de facto child protection regulator to the private sector.
What are the draft Code’s standards?
The consultation proposes sixteen areas which you must take into consideration.
Best interests of the child
This section tasks you – yes, you – with the following responsibilities towards children:
- keep them safe from exploitation risks, including the risks of commercial or sexual exploitation and sexual abuse;
- protect and support their health and wellbeing;
- protect and support their physical, psychological and emotional development;
- protect and support their need to develop their own views and identity;
- protect and support their right to freedom of association and play;
- recognise the role of parents in protecting and promoting the best interests of the child and support them in this task; and
- recognise the evolving capacity of the child to form their own view, and give due weight to that view.
While there absolutely are steps that we all can and should take to support those processes, this section tasks businesses with so many fundamental obligations over the personal health and wellbeing of children that it amounts to a demand for corporate co-parenting.
Age-appropriate application
To make the world a better place for children and young people, you – yes, you – will have three options:
- Apply the Code for all users as a default, meaning assume all users are children, thereby limiting or restricting what the service can offer to adults and under-18s alike;
- Implement an age gate collecting personal data about all users, and develop multiple variants of the product or service for five defined age bands, ranging from infancy to the cusp of adulthood, in addition to an ‘adult’ version;
- Implement an age gate collecting personal data about all users, to let in adult users and block younger users from accessing the service altogether.
To implement the age gate, all businesses within the Code’s scope will be required to collect age verification data, such as a passport or credit card, for all users, and to process and retain it in full accordance with GDPR. This age gating will render all internet usage in the UK personally identifiable to an individual, creating massive private databases of personal internet access.
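To make concrete what “implementing an age gate” actually entails, here is a minimal sketch in TypeScript. Every name in it – the identity fields, the five illustrative age bands, the helper functions – is my own assumption, because the draft Code prescribes no mechanism; but any implementation will share the same basic shape: verified identity data collected up front, retained, and consulted before a single page is served.

```typescript
// Hypothetical sketch only: the draft Code names no mechanism, so all of the
// identifiers and band boundaries below are illustrative assumptions.

type AgeBand = "0-5" | "6-9" | "10-12" | "13-15" | "16-17" | "adult";

interface VerifiedIdentity {
  userId: string;             // every visitor becomes an identified individual
  dateOfBirth: Date;          // sourced from a passport or credit card check
  documentReference: string;  // retained, and itself subject to GDPR
}

// Approximate age in whole years; good enough for a sketch.
function resolveAgeBand(identity: VerifiedIdentity, now: Date = new Date()): AgeBand {
  const age = Math.floor(
    (now.getTime() - identity.dateOfBirth.getTime()) / (365.25 * 24 * 3600 * 1000)
  );
  if (age >= 18) return "adult";
  if (age >= 16) return "16-17";
  if (age >= 13) return "13-15";
  if (age >= 10) return "10-12";
  if (age >= 6) return "6-9";
  return "0-5";
}

// The gate has to run before any content is served, so identity collection
// becomes a precondition of access rather than an optional feature.
function gateRequest(identity: VerifiedIdentity | null): { allowed: boolean; variant?: AgeBand } {
  if (!identity) {
    // Option one in the draft: no verification means applying the full Code
    // to everyone (or blocking access) by default.
    return { allowed: false };
  }
  // Options two and three: serve an age-banded variant, or admit adults only.
  return { allowed: true, variant: resolveAgeBand(identity) };
}
```

Note what the sketch cannot avoid: there is no way to call gateRequest without first building, and keeping, a record that ties a real identity document to every user.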
This section is nothing less than a mandate to lock the entire British internet – every site, service, and app – behind an age-gated surveillance system.
Demanding “robust age-verification mechanisms” of all users fundamentally alters the structure of global internet governance while outsourcing the compliance costs to UK businesses. This is well beyond the ICO’s purview.
Oh, and by the way: the development of mandatory age gating will mean businesses being reliant on the only companies which will have age verification services available for cash-tight agile businesses to deploy within a three-month compliance window: adult entertainment giants. The biggest winner from the Code will be the porn industry.
Transparency
This section requires you – yes, you – to impose adult concepts such as GDPR-standard privacy policies onto five broad age bands, including the tiniest toddlers. For the youngest children in the earliest stages of personality development, issues of data privacy will not be seen as such; they will be seen as issues of safety.
I am incandescently concerned that this section conflates issues of general data protection rights with issues of immediate personal safety and risk, and tasks every business with an online presence to implement that conflation as a code of practice.
The draft code, for example, suggests that developers should “provide audio or video prompts telling children to leave things as they are or get help from a parent or trusted adult if they try and change any high privacy default settings” – for children under the age of five. Imagine a tiny child hearing a strange voice urging her to go and get help from an adult.
Likewise, the draft Code provides a suggested mockup of a consent window which says “If you don’t understand or aren’t sure about this then you should leave the setting as it is, and we won’t use your information in this way.” This is an extraordinary amount of self-doubt and uncertainty – look at that treble negative! – to introduce into a six-year-old’s thought process, over (for example) an app-based game they might play for ninety seconds.
The text noting that “If your online service includes a physical product, for example a connected toy or speaker you should include the icon on your packaging, highlighting online reporting tools as a product feature, and find ways to highlight reporting tools in a prominent way even if the product is not screen based” mentions nothing about the primary reporting tool in that situation: the parent or adult.
These draft requirements for transparency are a recipe for creating a culture of fear. This is not a role you should be compelled to play.
Detrimental use of data
This section asks you – yes, you – to not take actions which go against industry codes of practice, other regulatory provisions or Government advice. This is a monumental swathe of guidance to consider in defining detriment, most of which has no legal basis. Additionally, the concept of “detriment” is itself the subject of a separate consultative process (the online harms white paper).
Policies and community standards
This section asks you – yes, you – to be more proactive in publicising terms, policies and community standards, but does not say what they are.
The draft notes “If you make commitments to users about the content or other aspects of your online service then you need to have adequate systems in place to ensure that you meet those commitments. So if you say that the content of your online service is suitable for children within a certain age range then you need to have systems in place to ensure that it is.” This conflates systems, meaning internal processes, with policies, meaning public-facing statements. And it does so in a way which could be abused as a means of compelling you to disclose proprietary business information or violate non-disclosure agreements.
Default settings
This section largely repeats the work that you already did ahead of GDPR.
Data minimisation
This section largely repeats the work that you did ahead of GDPR. However, the ICO is not cognisant of the contradiction inherent in requiring businesses which do not specifically target children to nevertheless collect data about children’s usage – and to identify them as children – in order to determine whether that site or service is being accessed by them. In other words, to minimise data collection about children, you must maximise it, to the extent that it becomes a primary business activity.
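To make that contradiction concrete, here is a minimal sketch, again in TypeScript and again with entirely hypothetical names: the only way to answer the Code’s threshold question – is this service likely to be accessed by children? – with any confidence is to hold age data on your entire user base.

```typescript
// Hypothetical sketch only: the names and thresholds are my own assumptions.

interface UserRecord {
  userId: string;
  dateOfBirth?: Date; // unknown unless age data has been collected for this user
}

// Returns true/false where age data exists, otherwise "unknown".
function isChild(user: UserRecord, now: Date = new Date()): boolean | "unknown" {
  if (!user.dateOfBirth) return "unknown";
  const ageYears =
    (now.getTime() - user.dateOfBirth.getTime()) / (365.25 * 24 * 3600 * 1000);
  return ageYears < 18;
}

function likelyAccessedByChildren(users: UserRecord[]): boolean | "unknown" {
  const verdicts = users.map((u) => isChild(u));
  // Any gap in the age data makes the question unanswerable, so the only
  // defensible compliance position is to collect age data on every user:
  // maximising collection in the name of minimising it.
  if (verdicts.some((v) => v === "unknown")) return "unknown";
  return verdicts.some((v) => v === true);
}
```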
Data sharing
This section, while largely repeating GDPR’s provisions, shifts the concept to an intangible measure of “general wellbeing” rather than one of tangible data protection compliance – a measure it is not for you to determine.
Geolocation
The safe, responsible, and legal leveraging of location data carries a huge consumer benefit to both adults and young people, and is helping innovation to flourish. I was therefore appalled to note that this section associates the use of location data with “abduction, physical and mental abuse, sexual abuse and trafficking.”
This hysteria could lead to young adults being infantilised under rules prepared for toddlers; rules which could, for example, prevent them from using a car-share app to get home because it relies on geolocation data.
Parental controls
This section mandates parental controls from a peculiarly legalistic standpoint, compelling adults to prioritise data protection frameworks over their own parental instincts. For example, within the suggested guidelines for information across age bands, there is a preoccupation with “provid[ing] parents with information about the child’s right to privacy under the UNCRC”. Surely, advising parents to respect their children’s right to privacy as a matter of supporting their growth, development, and safety is a more reasonable approach, as opposed to mandating an explainer on parents’ role in helping the ICO to uphold international treaty obligations.
(Apparently not: at a panel on “Safeguarding Dystopia” which I attended at ORGCON 2019, a representative from UNICEF spoke for over ten minutes about the UN and the UNCRC’s primacy in the age appropriate design code. It was, if nothing else, a good way to stall for time so that difficult questions couldn’t be asked.)
The suggestions on parental controls for all five age bands also propose that you should “provide a clear and obvious sign that indicates when monitoring or tracking is active”. Has any consideration been given to this requirement causing “parental monitoring blindness” – akin to cookie consent popup windows, something so ubiquitous it becomes ignored?
That’s all aside from the obvious implications of children over the age of 13 transitioning to adulthood under the watchful eye – perhaps a literal icon of one – of a constantly activated parental monitoring signal, and how this is in fact a violation of their personal privacy. How did their parents ever grow up without one?
Profiling
The section on profiling is an alphabet soup, encompassing GDPR, PEGI, CAP, >18s, the advice of the CMO, and T&Cs on UGC. It compels you to take a dense and legalistic approach to one of the most critical threats to children’s wellbeing that they currently face: algorithmic curation and profiling.
Additionally, this section states that profiling is allowed if measures are in place to protect the child from “any harmful effects”, which ties in with the online harms white paper’s discussion of content and activities which are legal but harmful. As this Code would precede the outcome of the white paper consultation, this is an instance where the code would need to establish “harm” rather than follow it, a guaranteed recipe for deviation from the actual final definition.
Nudge techniques
“Nudge techniques” has been misinterpreted, buzzword-style, as an inherently negative feature. Nudge techniques can be used positively, for example, to encourage children to protect their data, turn on their privacy settings, not disclose unnecessary information, and so forth.
The ICO should use more constructive phrasing, suggesting positive nudge techniques which would be deemed acceptable, while citing specific negative techniques and dark patterns which would be deemed unacceptable. Until then, this section risks throwing out the baby with the bathwater – literally.
Connected toys and devices
This section states “If you provide a connected toy or device, ensure you include effective tools to enable compliance with this code”.
What are those tools, who is responsible for creating them, and how will compliance be evaluated? Businesses consulting a Code of Practice expect to be given answers to questions like these within the code, not open-ended questions to solve on their own time. The best answer to that question – the DCMS Code of Practice for consumer IoT security – is linked as an external reference at the end of the section, but not discussed anywhere within it.
Related to that ambiguity, this section of the draft Code offers practical instructions, such as “Provide clear information about your use of personal data at point of purchase and on set-up” and “Avoid passive collection of personal data”. These are checkpoints for product and service developers to use. They are not tools for young people and their parents to use.
Online tools
As with the section on Transparency, while this section is ostensibly about helping children to exercise their data protection rights and report concerns over the misuses of their data, there is a clear conflation of immediate personal safety needs with frankly advanced GDPR rights such as data portability.
For example, this section recommends that you create an “I need help” button for the tiniest children to discuss their data rights with an adult. I can assure you that no three-year-old will be pushing that button to file a subject access request.
Data protection impact assessments
At first glance, this section may seem to be a duplicate of guidance on DPIAs which the ICO provided ahead of GDPR. However, this section suggests several new and specific questions regarding children, and requires businesses whose services could possibly be accessed by a child to carry out a child-specific DPIA. This effectively moves the goalposts for businesses which have already completed DPIAs, whether their services target children or not.
As it stands, this section mandates fresh DPIA processes for everyone.
Governance and accountability
As with the section on Policies and Community Standards, this section asks you to provide more clarity on internal procedures without stating what kinds of processes will be mandatory and expected.
The bottom line
As children grow, evolve, and approach adulthood, every day of their lives is a delicate exercise in redefining trust as they create their own identities and worldviews. As it stands, this draft Code will mean that our children will grow up sheltered, shattered, and shamed, as their outlooks and formative experiences are shaped under a mountain of warnings, age gates, and surveillance – all of which must, by law, be crafted to send a clear message that the world is not to be trusted, and neither are you.
And we – yes, we – will be expected, on the presumption of our guilt, bad faith, and complicity, and under grave warnings of penalties, fines, and enforcement action, to be responsible for sending that message.
We all have a role to play in making the web a better place, as do all aspects of our society. This draft Code does not respect that role, nor does it respect you, nor does it actually respect the young people it means to protect.
By compelling you to engage in corporate co-parenting, the obliteration of young people’s privacy, and the lockdown of the entire British internet, the draft Code is a blueprint for creating insecure children and stunted adults who are unable to mature away from their parents, safely discern their environments, or trust their own judgement.
For that matter, it also co-opts you – yes, you – into building a decentralised surveillance apparatus over the entire British population in the name of “the best interest of the child”.
A better solution, grounded in technical standards, healthy cooperation, and practical applicability, must be found which can help us all to facilitate the safeguarding of young children, to provide young adults with the confidence to learn how to safeguard themselves, and to allow those approaching adulthood to deploy their safeguarding abilities in becoming functional adults.
The draft Age Appropriate Design Code, in its current form, does not present one.
Next steps
The final Age Appropriate Design Code was laid before the Secretary of State on 22 November 2019, despite the General Election. It was a finalised Code, likely to be presented as “comply or else YOU HATE THE CHILDREN”, rather than a further draft for further consultation. What was in it? At the time, we didn’t know: the final Code could not be revealed publicly until there was a new Parliament to receive it.
When the final code – all 115 pages of it – was released in January 2020, there was a climbdown from the mandatory age verification of all content, for all users, called for in the draft. (Was it something I said?) Instead, the final code calls for “age assurance”: owners of impacted sites, services, and apps can find alternative ways of verifying their users’ ages. By my reading of the final version, that isn’t good enough. The final code still leaves open the possibility of it being used as a backdoor for a lockdown of all UK internet usage.
That, after all, was its intention all along. In a piece from January announcing the final code, the Times breathlessly declared – in a tone and phrasing suspiciously familiar to those of us who work behind the scenes – that “Under the Code’s 15 provisions, tech firms operating in Britain will no longer be able to hide behind the ‘we are platforms, not publishers’ excuse provided by US law.”
Huh? Wasn’t this supposed to be a “design code” about how to create better products and services? Wasn’t it supposed not to be a salvo in the online harms battle against American social media companies, or a means to normalise content moderation, filtering, and censorship now that we’re transitioning away from the EU’s e-Commerce Directive?
More fool you if you thought it was the former.
As of May 2020 there is a near-hysterical push from children’s rights groups, some of whom know full well what they’re really aiming for, to get the Code into Parliament and into law as soon as possible. It is worth noting that Parliament is under no obligation to rubber-stamp the Code, and can send it back or reject it. They would be wise to consider their options carefully, particularly given the continued absence of an impact assessment which takes into account the societal and financial impact of all possibilities from “age assurance” to mandatory age verification.
(Wise.)
Delighted to see @OliverDowden in @DCMS qs commit to requiring @ICOnews to rectify their omission and carry out a full economic impact assessment of the Age Appropriate Design Code, as we @coadec have requested since last year. Startups & SMEs shouldn't pay for tech giants' sins.
— Heather Burns (@WebDevLaw) June 4, 2020
Another thing Parliament should consider is that the phrase “age appropriate” itself lacks a clear definition, and that it carries different meanings in different cultures. In the American data policy context, for example, the term “age appropriate” refers specifically to material which should undoubtedly be out of children’s reach, such as adult content and access to alcohol, tobacco, and firearms. It does not refer to any and all content and material which a child may access on any conceivable topic. Yet in the UK context, as Amendment 123 and the ICO have defined it, “age appropriate” means just that.

Remember too that the Code will apply to businesses located outside the UK but which have UK users, which means that the code’s concept of “age appropriate” is likely to be a barrier to entry – if not a bona fide trade barrier in the upcoming negotiations – for those who may not be aware that the term here, as legislated, means something far broader than it does internationally. And that’s saying nothing of the equally subjective issue of online harms and its impact on the trade negotiations, which would be a much longer post in and of itself.
It is also impossible to consider the AADC without considering the context of the bizarre, personality-centred media push which its creators and supporters engaged in during the summer of 2019, and will no doubt engage in again. A little birdie tells me that one woman depicted in this article did not inform her organisation that she was doing it, which caused quite a bit of trouble for her at work on the Monday. The phrase “she’s gone rogue” was mentioned by the exasperated civil servants left to clean up after her.
In October 2019 the Children’s Commissioner for England – of that media push’s sorority – produced a sycophantic report to bolster the case for the Age Appropriate Design Code. The report was based on a laughably poor focus group of 29 children who were asked slanted questions designed to produce the answers the Commissioner wanted to hear and to arrive at a foregone conclusion. Tellingly, one of its policy suggestions was that “the Code must be laid before Parliament as a matter of urgency”; the hallmark of bad policy is a determination to push it through as fast as possible with minimal scrutiny.
Perhaps more worryingly, in October the DCMS Select Committee reiterated its existing concerns about DNS over HTTPS – which it views as an impediment to the Age Appropriate Design Code – in its questions to the Secretary of State for Digital (see question 636 in the transcript). Anonymous browsing, after all, is only used by suspicious people.
All of that is aside from the political curiosity raised by the collapse of age verification for adult content, and it being seen as a matter for the online harms framework to pick up. (The sorority was spitting nails here, folks.) Age verification lives in the Age Appropriate Design Code, not the online harms framework, but the code has been completely omitted from the postmortem dialogue. What might be going on there?
Whatever form the final Age Appropriate Design Code takes, it will be the tech sector’s harsh welcome to the sunlit uplands of the Conservatives’ vision for post-EU digital regulation, a framework which looks instead to the regulatory model which the Code’s primary driver openly admires: China.
(May 2020 note) Finally, to come full circle, I’m now a participant in a UNICEF working group which is creating standards for children’s data privacy. My homework on that has had me looking into the UN principles which the Code was allegedly based on. What I’ve learned from that is how far the Code strayed from those original principles. It’s been amazing, really, to take in the breadth of just how badly they managed to mess this up.
The Age Appropriate Design Code should have been just that: a design code, containing workable design guidelines for safeguarding children’s data privacy, full stop. It should not have been a backdoored weapon in the government’s vendetta against American tech giants, a means to erode intermediary liability protections, or a tool to introduce content moderation, filtering, and censorship requirements by underhand means without an impact assessment.
Yet that’s exactly what it is.
The code comes into force on 2 September 2020 and becomes enforceable after a 12 month transition period.