Colm Gannon

Algorithm Parenting – the computer says “No!”

Algorithm Parenting – it’s a term I use to describe the excuses of those who blame Electronic Service Providers for failing to have “algorithms” in place to stop young people from engaging in harmful online behaviour. Yes, there are times when trust and safety systems should, and are lawfully required to, remove, prevent and report illegal content, but there are limits to automation. Algorithms are modelled on current human behaviour and will always lag behind the evolution of human online interaction.

Having worked at the coal face investigating online child exploitation in New Zealand and child sexual abuse cases in Ireland, I understand the importance of robust online trust and safety protocols. My experience has also taught me the negative impact created by over-reliance on safety algorithms.

However, as a society, we forget about the concept of “proportional responsibility”. When does the excuse become just that, an excuse? When does the laying of blame signal that we as a society are pushing away our civil responsibility to our children and young people?

A recent global movement has been growing around preventing children and young people from accessing adult pornographic material. The movement seeks a technological solution to verify age and block under-18s from accessing adult pornographic material, whether accidentally or by deliberate action. There are two important aspects to the reasoning behind “age verification”: the requirement and the response.

Looking at the requirement to restrict pornographic (porn) material from under-18s, most people would agree that this is generally a good idea, even if some may not realise why. Research has been conducted by the Australian Institute of Family Studies, the Office of Film and Literature Classification [OFLC] and the Light Project. This research shows that young people are accessing porn, which is no surprise; however, the drivers behind accessing porn are an important focus of the studies. Some youth access porn as a sex education tool, some because of peer pressure, and others believe they can use porn to identify their sexual orientation.

The biggest concern for our society highlighted in the OFLC research is that of the 2,000 young people surveyed, almost half (44%) had engaged, or desired to engage, in a sexual act that they had seen whilst viewing porn [pg 37]. Research undertaken on intercourse involving asphyxiation identifies that 13% of sexually active girls between the ages of 13 and 17 have been choked during sex [Herbenick, D. et al., 2019, Feeling Scared During Sex, Journal of Sex & Marital Therapy, 45:5, 424–439]. The reasoning behind society’s call for restrictions on young people accessing porn is that porn is having a negative impact on how sex is viewed [ref:] and that people (youth and adults alike) struggle to discern the difference between sexual acts on screen and in real life.

The response to this issue by Governments is, in my opinion, a major concern: age verification for adult pornographic sites! The question I pose, given that research has demonstrated that the rise in sexual aggression in society can be attributed to high consumption of porn [Wright, P. et al., 2016, A Meta-Analysis of Pornography Consumption and Actual Acts of Sexual Aggression in General Population Studies, Journal of Communication, 66:1, 183–205], is this: will age verification lead to the protection of society?

A news article published in New Zealand highlights that the NZ Government is exploring the idea of an age verification process for users to access porn[4]. This model is similar to the UK’s, which agreed in principle to its implementation through the Digital Economy Act 2017. Looking at the problem, this is not a distributed-technology issue alone. We must also consider the “proportional responsibility” model. Service providers can incorporate trust and safety features into their products – like Pornhub, which has ensured that its website is RTA [Restricted to Adults] compliant. At a certain stage, parents and caregivers have to take responsibility for the devices that their children and young people are using. They need to become digitally literate, have access to assistance from Government services, have access to resources provided by their mobile and internet service providers and, more importantly, have access to a State co-ordinated support service, 24/7, 365 days a year.

Age verification still does not resolve the issue of young people accessing porn. The use of other technology to distribute images or video files has already given rise to terms like “cyber flashing”. The use of Virtual Private Networks (VPNs) to circumvent security controls is viewed as “norm” behaviour by young digital users and will be a simple and effective way to counteract an age verification protocol. The proposal also has the potential to push young people into identity theft or, just like in the “VHS era”, to encourage them to share or sell access to adult porn.

Our Governments need to realise that, whilst this is a complex issue, it is one that can be easily addressed. Young people want to see change. Adults want to see change. In reality, society demands change. Governments need to understand that the online platform is not to blame; it is the choices people are making. Digital advertising algorithms provide options based on a user’s choices. The algorithms create prompts to access porn based on the user’s interaction with porn. Because the algorithms are built on the user’s online habits, the user is pushed into a “digital echo chamber”. “Proportional responsibility” empowers users to make the right decision. The values we teach as a society are reflected by its functioning members: consent, relationships, tolerance, anti-violence, harm reduction and, one of the most important, the confidence to ask for help.
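To make that feedback loop concrete, here is a deliberately simplified sketch – not any real platform’s system – of how click-based reinforcement narrows what a user is shown. The category names and the weighting rule are illustrative assumptions only:

```python
def update_weights(weights, clicked, boost=1.0):
    """Each click on a category increases its weight, so that category's
    share of future recommendations grows: the echo chamber effect."""
    weights = dict(weights)
    weights[clicked] = weights.get(clicked, 0.0) + boost
    return weights

def share(weights, category):
    """Fraction of recommendations a category would receive."""
    return weights[category] / sum(weights.values())

# All categories start on an equal footing.
weights = {"news": 1.0, "sport": 1.0, "adult": 1.0}
print(round(share(weights, "adult"), 2))  # 0.33 - one choice among three

# Twenty clicks later, the user's own habit dominates what they are shown.
for _ in range(20):
    weights = update_weights(weights, "adult")
print(round(share(weights, "adult"), 2))  # 0.91 - the "digital echo chamber"
```

The point of the sketch is that the narrowing is driven entirely by the user’s own choices, not by anything the platform decided in advance – which is exactly the “proportional responsibility” argument above.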

Algorithms will not produce the moral compass of society. Tech companies can create the greatest and most robust safety protocols; however, the question still remains: if we as a society do not want to talk about the problem and deal with the hard issues, are we complicit, together with the safety policies being regulated upon us, in “Algorithm Parenting”?