The Government still has to lay it before Parliament. “Once the code has been laid it will remain before Parliament for 40 sitting days. If there are no objections, it will come into force 21 days after that. The code then provides a transition period of 12 months, to give online services time to conform.” So we have to wait until about April 2021 before we can be sure apps and platforms are designed with children in mind.
You can act now
But there is no need for us parents and grandparents to wait before keeping a close eye on what children are seeing today and every day. The 5Rights Foundation explains the key features on which the Code is based.
Useful to understand
1. ‘A child is defined as a person under 18 years of age’
Currently, child-specific data protections more or less vanish once a child turns 13 in the UK. Do you think a 13-year-old is old enough to consent to the processing of all their data without any protection? I don’t.
Happily, the Code asserts that all under-18s are entitled to some protection for their data, reflecting the child’s age and developmental needs. So, while all under-18s are entitled to protection, that protection doesn’t need to be the same at every age.
2. ‘Likely to be accessed’ by children
The Code applies to all services that are ‘likely to be accessed’ by children, which means it covers all the services children use in practice, not just those that are designated as ‘directed to children’. Service providers will have to demonstrate that they don’t have child users if they are challenged.
Children make up a third of internet users worldwide and spend the vast majority of their time on services that can’t be neatly categorised as ‘directed to children’. But as we see more babies watching smartphones from their buggies, and toddlers enjoying songs and learning games on tablets, we know that the UK’s 15 million under-18s are likely to be positively affected by the arrival of the Code in 2021.
3. ‘Automated recommendation of content’
The Code is clear, concise, and ground-breaking in what it tells online companies they MUST do:
‘If you are using children’s personal data to automatically recommend content to them based on their past usage/browsing history then you have a responsibility for the recommendations you make’.
‘Data protection law doesn’t make you responsible for third party content.’ Third parties are, for example, people who share material on platforms such as Facebook, people who post on sites such as YouTube, or trackers such as cookies that follow you around the web.
But the Code ‘…does make you responsible for the content you serve to children who use your service, based on your use of their personal data.’
This would tackle cases such as that of Molly Russell, the British teenager who took her own life in 2017 and whose family subsequently discovered that she had been repeatedly auto-recommended graphic self-harm and suicide content on social media platforms like Instagram and Pinterest.
4. Enticing children must stop
Companies are told ‘Do not use nudge techniques to lead or encourage children to provide unnecessary personal data or weaken or turn off their privacy protections’.
The Code says ‘no’ to nudge techniques that might lead children to lie about their age when signing up to online services. But it does encourage nudges that help children protect themselves. The BBC has an app, Own It (https://www.bbc.com/ownit/take-control/own-it-app), which helps users think about how their communication might be received: if angry or rude language is used, it pops up with a question suggesting you think again. That is a positive nudge technique.
Remember: being on the Internet means stepping through an open door into an often hostile environment.
Always look beyond the first screens of any app and make sure what lies behind them is OK.
Get ready to tell the Information Commissioner about anything you think breaches the Code.