Joe Gray

Broadcasting to offenders: Where are the safeguards?

[Image: four TVs showing the logos for Facebook Live, YouTube Live and Instagram Live, and the word "livestream".]

As Covid-19 continues to disrupt our once-familiar daily routines of work and school, one of the real negative impacts we have observed from the pandemic is the increase in online offending. One of the most prevalent areas where offenders are ‘hunting for their prey’ is live streaming platforms. For those who aren’t familiar, live streaming platforms allow an individual to create and record a live broadcast simultaneously. Some of the most popular live streaming platforms are Facebook Live, Periscope, Twitch, LiveMe and Bigo.

Live streaming is huge. Here are a few statistics to gauge the scale:

In 2018, the number of Facebook Live broadcasts reached 3.5 billion

Periscope streams approximately 350,000 hours of video per day. If a person were to sit down and watch it all continuously, it would take them 40 years

In 2020, Twitch amassed 889 billion minutes of broadcasts watched since its launch. In February 2020 there were 3.8 million unique broadcasts.

Offenders are utilising live streaming platforms to commit Live Distance Child Abuse (LDCA). LDCA can be split into two distinct categories: financially incentivised and non-financially incentivised. Both involve similar activity; however, the former requires payment to be made before any activity takes place. Essentially it is pay-per-view, where a victim is forced to undertake a sex act, and it is predominantly associated with developing countries.

Non-financially incentivised LDCA is when a victim broadcasts ‘self-abuse’ without the viewing parties providing financial compensation. This type of offending occurs on the popular ‘social’ live streaming platforms. The live streaming is usually conducted on a mobile device, although most services are now also available on desktop, and takes place in the supposed safety of the home, bedroom or bathroom.

Offenders lurk on these platforms searching for children broadcasting and then attempt to ‘groom’ them into self-abusing. Offenders are aware that children crave, almost need, attention — a sense of belonging. Children want to be popular and liked, and they equate that with the number of followers they have or the number of likes they receive on a post, image or video. Live streaming platforms offer exactly that, with hearts, gifts and followers. Offenders coerce children by offering to increase their follower counts, paying them compliments or enticing them with financial incentives.

Statistics indicate that a large percentage of children want to become internet famous, whether as an influencer or a social media celebrity, as this type of career is now regarded as an attractive and viable option. This, along with the desire for attention and belonging, provides offenders with ideal opportunities.

The offending will typically start with fairly innocuous, suggestive comments and will often involve attempting to obtain the child’s other social media accounts, such as ‘Insta’ or ‘Snap’ (Instagram or Snapchat). This is where the real offending takes place: in a private broadcast, group or DM (direct message), away from the prying eyes of others.

Offenders will share the details of their victims or potential victims with each other so that they can work together, repeating requests until they have achieved their goal.

Why is it so easy for offenders to find vulnerable children and commit offences on these popular platforms? The first answer is that they aren’t necessarily getting the child to self-abuse in public broadcasts anymore. Experience has shown that doing so results in their accounts being suspended and a criminal report being made to law enforcement (a cybertip report). The second answer, linked to the first, is that the ‘Trust and Safety’ teams at the service providers are unaware of the techniques and modus operandi of the ever-evolving offenders. With the best will in the world, unless you have experience investigating this type of offending you won’t necessarily know what to proactively look for, and you will only become aware when it is potentially too late.

A popular technique offenders use to identify each other on these platforms is displaying specific emojis and taglines. The use of emojis in itself isn’t an offence, as everybody uses them; however, a number of legitimate emojis carry an alternative meaning that provides a clear signal of sexual preferences to other offenders.

Once offenders have identified each other, they can create private groups to share information or child sexual abuse material, safe in the knowledge that their accounts won’t be shut down, because service providers are ignorant of their techniques and, on the surface, they are not breaching terms and conditions.

How do we tackle this issue? A good start would be for service providers to improve their ‘Trust & Safety’ capabilities, and for better co-operation between industry and law enforcement, so that when new trends and techniques are identified they can be shared quickly to reduce the number of children becoming victims.