Society and communities
The battle against children’s unsupervised screen time
The current lockdown continues to be challenging for children and carers. Home-schooling is stressful and difficult, especially for working parents struggling to keep children engaged, happy, motivated and, of course, educated while working from home.
An added difficulty is keeping children focused on their schoolwork for an average of five to six hours a day, five days a week, while allowing short but frequent unsupervised breaks.
Balancing screen time during lockdown is challenging because school-related activities are now mostly delivered online. Schools engage with carers and students via email, apps such as MarvellousMe, and school websites offering new weekly links to resources. Children also connect with their friends through social networking services, and play time and entertainment happen online too. Most of this screen time is unsupervised while carers try to maintain their working lives from home.
Unsupervised breaks often include a visit to TikTok or to free-to-access online games in which children are bombarded with unwanted advertising. Many children sign up to online services below the minimum age requirement (often 13 years old). Parents often wish they had more control over their children’s use of technology and the content they consume, but struggle with their own unfamiliarity with technology. Given that adults’ data literacy is often lacking and their attitudes are so varied and context specific, how can children be expected to navigate these practices?
Lockdown is bringing to the forefront issues around exposure to inappropriate content, over-engagement with technologies, and the inability of children (and adults alike) to disengage at will. This is a rather unfair situation in which children’s online rights may not be respected. It is well known that the persuasive elements of technology (e.g., notifications, alerts, pings, eye-catching interfaces) make it almost impossible for children to turn off their devices. Intent solely on keeping users’ attention fixed on digital devices and maximising the chances of collecting and monetising more personal data, online companies seem not to care whether that data comes from adults or from minors who are unaware of the consequences of online profiling for targeted advertising.
Children are usually well versed in internet safety strategies, which often focus on ‘stranger danger’ and threats posed by other people rather than by platforms or data collection. The UK school curriculum covers online safety in terms of protection against individuals, such as threats of “sexual predation, online bullying and harassment”, but does not cover the use of personal data by online platforms, or how children and young people can develop self-efficacy and protect themselves from corporations whose business models rely on the monetisation of personal data. Children can therefore talk about, for example, not sharing their location with others (by changing privacy settings and so on), but do not consider how to take more control of their personal data from the platforms themselves.
Regulation and compliance
An amendment to the UK Data Protection Act 2018, championed by our research at the University of Nottingham and by 5Rights and led by the Information Commissioner’s Office (ICO), has resulted in the Age Appropriate Design Code. The code, which comes into effect by the autumn of 2021, sets out 15 standards to protect children from corporations rather than from people. It is a statutory code of practice, and online services should demonstrate that they comply with it, for example by collecting only the minimum amount of personal data needed to provide the service, setting notifications to ‘off’ by default, and not disclosing children’s data unless a compelling reason to do so can be shown.
The code is an effort to ensure that the digital world keeps developing with children and young people in mind. While other guidelines exist for the ethical treatment of children online, several recent reports suggest there is more to be done to ensure an internet fit for children, both in terms of regulation and in curbing design that encourages over-engagement.
A recent report by the Royal College of Psychiatrists, Technology use and the mental health of children and young people, highlights the need to prioritise the strictest enforcement of the Age Appropriate Design Code for services that target, or are popular among, children. Under this enforcement, services should default to the highest privacy levels and assume that users need child protection until explicit action is taken to opt out.
As children grow up surrounded by more and more technology, they have to be part of the discussions around online data protection. Their education has to go beyond e-safety and extend to broader media literacy across all school stages. It is important to address both the risks and the opportunities, to create insight into and awareness of personal engagement with technologies, and to encourage agency. Data literacy is crucial not only for children and young people at all stages, but also for teachers, parents and primary caregivers. We are all responsible for promoting children’s digital rights in the face of the dramatic increase in personal data collection by online companies and its impact on personalised content and consequent over-engagement.
The code, once enforced by law, will be a step towards winning the battle against children’s unsupervised screen time, so that carers can be reassured that the content consumed is appropriate and that time spent online is easily self-managed.
Elvira Perez Vallejos
Dr Elvira Perez Vallejos is Associate Professor of Digital Technology at the National Institute for Health Research Biomedical Research Centre, Institute of Mental Health. She collaborates with Horizon to investigate the risks and opportunities that the digital world offers to our mental wellbeing.