
Campaigning Focus for UK General Election: Advocating for Regulation of Nudity in Apps and Prevention of Image-Based Harassment

"Nudify" apps enable users to alter photos so that the subjects appear nude or semi-nude. With a general election approaching, the misuse of these apps has emerged as a key concern for younger voters.


UK Cracks Down on AI-Generated Non-Consensual Sexual Imagery

In a bid to protect its citizens from the harmful effects of artificial intelligence (AI), the UK government is taking significant steps to regulate the creation and distribution of AI-generated non-consensual sexual imagery.

Currently, under the Data (Use and Access) Act 2025 and the Online Safety Act 2023, creating sexually explicit images of another person without their consent, including AI-generated or deepfake images, is a criminal offense punishable by unlimited fines and possible inclusion on the sex offenders register. Sharing such images can result in up to two years' imprisonment.

However, the UK government is not resting on its laurels. With the Crime and Sentencing Bill 2025 and other proposals, the government is advancing new laws specifically aimed at banning AI-generated child sexual abuse material and AI tools used to create such images. The proposed penalties include up to five years imprisonment for creating, possessing, or distributing AI-generated indecent images of children or AI software designed for this purpose.

Additionally, the government is proposing to criminalize websites facilitating sharing or grooming using such content, with penalties up to 10 years imprisonment. Border officers will also be empowered to inspect digital devices suspected of containing child sexual abuse material.

UK online platforms also face strict requirements under the Online Safety Act 2023 to proactively remove non-consensual sexual images, including deepfakes, or face fines of up to 10% of global revenue. Meta (Facebook) has taken action against "nudify" apps, which generate fake explicit images of people without their consent.

Despite these stringent measures, some experts had pointed to a gap in earlier laws, which did not criminalize the mere creation of sexual deepfakes that were never shared. The government has been actively seeking to close such gaps through legislative changes.

AI-generated nudes have long demanded legislative attention, with celebrities repeatedly targeted online. Recently, underage deepfake pornography of Jenna Ortega and Sabrina Carpenter was used in Instagram and Facebook ads, an incident that underscores the need for more stringent regulation.

The rapid acceleration of AI capabilities presents new dangers and demands more sophisticated regulation. Governments face a multitude of new challenges, and the UK is no exception. The UK government needs to target image-based abuse specifically, restrict the features AI apps are permitted to offer, and police AI advertisements on social media.

The amendment to the Criminal Justice Bill focuses on intent to cause harm rather than on consent alone, a step towards addressing AI-generated nudes more effectively. Following reporting by 404 Media's Emanuel Maiberg, Apple and Google pulled multiple nonconsensual AI nude apps from their respective app stores.

As the general election approaches, online safety, particularly for women, is a major concern for young voters. A 550 per cent surge in deepfake videos online in 2023, according to research from Home Security Heroes, underscores the urgency of the situation.

In March 2024, it was discovered that Instagram and Facebook were distributing ads for Perky AI, a nudify application. This incident highlights the need for more stringent regulation of AI-generated content on social media platforms.

While the UK currently has some of the strictest laws globally against AI-generated non-consensual sexual imagery, the government is actively seeking to close gaps and strengthen its legal framework to protect its citizens from this harmful practice.
