The children’s commissioner for England is calling for “nudification” apps to be banned to prevent them generating sexual imagery of children. But what are they and would a ban work?
What are ‘nudification’ apps?
Advances in artificial intelligence software have paved the way for the emergence of “nudification” tools, which are becoming easier to find on social media or search engines.
These are apps and websites that produce deepfake nude images of real people using generative AI. This can involve removing clothes, getting an image to move suggestively, or pasting a head on to a naked body. The results often look highly realistic.
AI tools create images by learning from, and replicating elements of, existing images. Nudification apps tend to work most effectively on women’s bodies, which suggests they are trained on vast datasets made up largely of images of women. As a result, an estimated 99% of sexually explicit deepfakes accessible online are of women and girls.
Although it is illegal to possess AI-generated sexual content featuring children, the AI models used to create these images are not themselves illegal.
How would a ban work?
The children’s commissioner is asking the government to legislate to ban AI tools that are designed or marketed as “nudification” services.
This could be achieved through a number of legal mechanisms.
One option would be an amendment to the product safety and metrology bill requiring providers of generative AI tools to carry out risk assessments for illegal and harmful activity, and to take reasonable steps to design that risk out of their products.
This would mean that tools built on generative AI models have to be risk-assessed for illegal and harmful activity before being made available in the UK. Nudification apps in their current form do not guard against illegal activity and so would not be certified for availability in the UK.
A second option would be for the government to introduce an AI bill in this parliamentary session, making providers of generative AI models responsible for preventing their use to nudify children.
Technology companies could be legally required to test whether their products can be used to nudify children before launching them in the UK market, and could be held to account if their models are used for this purpose.
However, a ban could be challenged on internet freedom grounds, said Danielle Reece-Greenhalgh, a partner at the London law firm Corker Binning. She said it could also be difficult to enforce as AI models improve, making it even harder to distinguish between “real” and AI-generated material.
What powers does the Online Safety Act contain?
The children’s commissioner does not believe that the Online Safety Act contains the provisions required to fully protect children from harm.
However, she notes that in the meantime, the risk can be partially mitigated through Ofcom’s implementation of the act.
Nudification services, as providers of sexually explicit or pornographic material, fall within the scope of the act. They are required to verify that users are over 18 before allowing them access to content. However, this would not stop adults from making images of children.
Ofcom could also strengthen the protections it offers children by being proactive in identifying emerging harms.
Social media companies are required to carry out risk assessments to comply with the Online Safety Act. This should require them to identify and mitigate the risk to children of content produced by sexually explicit deepfake tools, including content used to promote them.
The report also asks the government to give children more support to report intimate images – including fake ones created using AI – that have been shared in a public online space, and to get them removed. The government could ask Ofcom to require technology companies to embed “report remove” tools.
It also suggests that sexually explicit deepfake technology be included on PSHE (personal, social, health and economic) curriculums.