Why do websites revise their policies? Some do it due to external pressure from banking institutions threatening to remove access to financial services, while others do it to reduce the exposure of their younger users.
How effective is it when a social media website revises its policies to prohibit sexual solicitation and nudity? To be honest, it depends on the popularity of the site and the efficiency of the site's 'Trust and Safety' team in enforcing these new policies. Let me expand on that.
The popularity of the site will determine whether users adapt their behaviours to circumvent the new policies or move to a different site with less stringent ones. If the site is 'popular', the effort to remain on it will be worthwhile. This is when users, especially sex workers (content creators), will need to evolve their activities and behaviours so they can promote their 'content' without the fear of having their accounts banned.
The efficiency of the site's 'Trust and Safety' team will be determined by how well it can identify these new behaviours. Without understanding how the policies are being bypassed, the team cannot act, and users will continue to use the site without the fear of having their accounts banned.
For example, let's look at TikTok.
TikTok was launched internationally in 2018 and experienced the fastest growth of any social media platform. As a result, TikTok is one of the top 10 social media sites globally with approximately 689 million active users.
TikTok's community guidelines prohibit nudity and sexual activity. Prohibited content includes:
Content that explicitly or implicitly depicts sexual activities including penetrative and non-penetrative sex, oral sex, or erotic kissing
Content that depicts sexual arousal or sexual stimulation
Content that depicts a sexual fetish
Content that depicts exposed human genitalia, female nipples or areola, pubic regions, or buttocks
Content that contains sexually explicit language for sexual gratification
The guidelines would suggest that there is no nudity or sexual activity on the site; however, this is not the case.
Content creators have had to adapt how they promote their services, using vocabulary, emojis and videos that aren't (yet) being identified by the site's 'Trust and Safety' team. Once the 'Trust and Safety' team becomes aware of these behaviours and installs new safety measures to mitigate them, users will evolve their methods or move on if the site loses popularity.
Content creators currently use emojis with a sexual innuendo or dual purpose. For example, most people will be aware that the aubergine/eggplant emoji is used to denote a penis; however, many may be less aware of the use of orange and black hearts/boxes, which signify the website Pornhub.
Vocabulary gets altered too: words like 'sex worker', 'sex' and 'nudes' have been replaced with 'accountant', 'seggs' and 'noodles'. Alternative sexual references such as 'railing' and 'pegging' are also used. Railing is hard intercourse, and pegging is where a male is anally penetrated by a female wearing a 'strap-on' sex toy.
Once a content creator has given an indication that they are a sex worker, users can access their profile, where they will be directed to the creator's Instagram account or given a link to their social media landing page.
A platform that changes its policies to reduce the exposure of younger members needs to be aware of this evolution, because if it isn't, it will be guilty of just brushing the problem onto a neighbour's driveway. Users will still use the platform; however, it will be to direct people to a third-party platform with less regulation. Less regulation means that young people continue to be exposed to this material and the wellbeing of content creators is potentially reduced.
What about those platforms that have external pressure placed on them by financial institutions? They have two options: first, to comply, self-regulate and remove the non-consensual and dehumanising content; second, to obtain alternative payment methods, thus mitigating the threat from traditional financial institutions. Both of these options have consequences.
Does the demand for non-consensual and dehumanising material stop because it has been removed from a site? The answer is simple: no. If there is demand, there will be supply.
A site that obtains alternative payment methods cannot be threatened by financial institutions and therefore will not be required to self-regulate or comply. Oversight and regulation are lost and, as previously mentioned, young people continue to be exposed to this material and the wellbeing of content creators is potentially reduced.