Tech giants have had a year to ensure their platforms make protection of children’s data a priority or face enforcement action, including fines.
Now that the deadline for complying has been reached, firms must follow 15 standards set out by the Information Commissioner’s Office’s (ICO) Age Appropriate Design Code – but what are they?
1. Best interests of the child
This should be a primary consideration when designing and developing online services likely to be accessed by a child.
So firms will have to consider, among other things, how to keep children safe from exploitation risks and how to support their health and wellbeing.
2. Data protection impact assessments
Firms should “assess and mitigate risks to the rights and freedoms of children” who are likely to access an online service, which arise from data processing.
They should take into account ages, capacities and development needs.
3. Age-appropriate application
A “risk-based approach to recognising the age of individual users” should be taken.
This means companies should establish the age range of the individual user, so that protections and safeguards can be tailored.
4. Transparency
Privacy information provided to users “must be concise, prominent and in clear language suited to the age of the child”.
5. Detrimental use of data
Children’s personal data must not be used in ways that have been “shown to be detrimental to their wellbeing, or that go against industry codes of practice, other regulatory provisions or Government advice”.
6. Policies and community standards
Companies must uphold their own published terms, policies and community standards.
7. Default settings
Settings must be set to “high privacy” by default.
8. Data minimisation
Collect and retain “only the minimum amount of personal data” needed to provide the elements of the service in which a child is actively and knowingly engaged.
Give children choices over which elements they wish to activate.
9. Data sharing
Children’s data must not be disclosed, unless a compelling reason to do so can be shown.
10. Geolocation
Geolocation tracking features should be switched off by default.
An “obvious sign for children when location tracking is active” should also be provided.
Options which make a child’s location visible to others must default back to “off” at the end of each session.
11. Parental controls
Children should be provided age-appropriate information about parental controls.
If an online service allows a parent or carer to monitor a child’s online activity or track their location, it should provide an “obvious sign to the child when they are being monitored”.
12. Profiling
Profiling, which is used for purposes such as targeted advertising, should be switched off by default on accounts belonging to children, the code says.
It will only be allowed if there are “appropriate measures” in place to protect the child from any harmful effects, such as content that is detrimental to their health or wellbeing.
13. Nudge techniques
Do not use nudge techniques to “lead or encourage children to provide unnecessary personal data or weaken or turn off their privacy protections”.
For example, a pop-up asking whether a user wishes to proceed should not make the “yes” button overly prominent while rendering the “no thanks” button much smaller.
14. Connected toys and devices
These should include effective tools to ensure they conform to the code.
15. Online tools
Children should be provided with prominent and accessible tools to exercise their data protection rights and report concerns.