Joe Gray

Social Media & Pornography (Part 2)

[Illustration: a carousel of seven smartphones showing a woman crying in a shower, the largest lit by a spotlight; in front of them, two credit cards marked with a red cross]

Recap: Part One discussed the following issues:

  1. The use of social media sites by content creators to share and advertise pornographic material.

  2. How content creators adapt when Trust and Safety teams amend community guidelines to prohibit their sites being used to share pornographic material.

  3. The use of third-party platforms with minimal to no regulation, and the demand for non-consensual and dehumanising material.

Part Two will discuss the consequences of financial institutions putting pressure on adult pornographic websites and the increase in demand for non-consensual and dehumanising content.

Over the last few years, society has seen an increase in demand for dehumanising material, driven by the social normalisation of problematic and harmful acts such as asphyxiation and defecation. Studies have also shown that young people use pornography to educate themselves about sexual behaviour, which has fostered sexual objectification and violent attitudes towards other genders.

In December 2020, Mastercard and Visa stated that they would block their customers from making credit-card purchases on Pornhub, following accusations that the website hosted non-consensual and dehumanising videos.

This statement was in response to enhanced media scrutiny, which also drew the attention of governmental and non-governmental organisations.

What are the consequences when financial institutions threaten to block their credit cards from being used on websites, or when governmental and non-governmental organisations focus their attention on platforms?

In Pornhub's case, this scrutiny, combined with its business model, led to the development of stronger trust and safety practices through self-regulation and co-operation with governments, law enforcement, non-governmental organisations, and academia.

Despite websites like Pornhub increasing their self-regulation and engaging in campaigns to reduce the distribution of non-consensual and dehumanising material, the demand for this type of material persists. While the mainstream adult pornography industry remains the focus of attention, individuals have capitalised on social media platforms to create content for financial gain. One prominent site is OnlyFans, a 'content subscription service' that allows individuals to earn money from their subscribers. Such subscription sites also give subscribers the opportunity to request bespoke 'content' from creators.

In August 2021, OnlyFans announced that it would ban adult content from October 2021, owing to pressure from banking institutions such as Mastercard and Visa, again over concerns about non-consensual and dehumanising content being uploaded to the site. A week later, OnlyFans reversed its decision after securing financial backing, but the initial announcement had already prompted creators to search for alternative platforms.

Once a site attracts unwanted media attention, content creators will search for alternatives. These alternatives allow them to continue without disruption until they, too, are discovered and become the subject of attention from governments, law enforcement, non-governmental organisations, and academia.

The problem for governments, law enforcement, non-governmental organisations, and academia is that unless they proactively scan the environment, they will remain unaware of this evolution, which allows further dehumanising or illegal material to be distributed. In addition, potential victims and content creators are pressured into producing more extreme material because of demand and the fear of losing revenue.

As previously discussed, social media platforms are used to share content or, when that becomes more difficult, to advertise other platforms that allow sexualised content to be promoted. If governments, law enforcement, non-governmental organisations, and academia are unaware of how content creators, offenders, and potential victims adapt, their ability to understand and regulate the environment will be limited.

Without an understanding and practical regulation of the environment, the demand for dehumanising material will continue to increase, and the social normalisation of problematic and harmful acts will be compounded.