While the possession or viewing of child pornography images is punishable in many countries, what happens when the children depicted in these images are not real people but fictional characters created by generative artificial intelligence (AI)? A British court has taken up the question and has, for the first time, ordered a man to stop using generative AI tools without the prior authorization of a police officer, the Guardian reports.
Anthony Dover, a 48-year-old British man, received this injunction not to use, visit or access such digital tools as a condition of a court order aimed at preventing child sexual offenses. He had been found guilty of creating more than a thousand images of young children in sexualized poses or clothing using generative AI.
The court in Poole (southern England), which banned him from using AI to generate child pornography images, also fined him £200. The case puts a particular spotlight on the software Stable Diffusion, which, unlike other models developed by private companies, is open source, meaning its code is directly accessible to users. Although the software includes usage restrictions, notably to block uses that violate laws on deepfakes (hyper-realistic fake photos or videos) or, precisely, on child pornography, it is regularly misused by individuals to create sexual content.
Beyond child pornography images, the software has also been used to create sexual deepfakes of real people, celebrities for example. The British government announced just last week that creating sexually explicit deepfakes depicting people who have not given their consent will soon be banned under a specific law, with penalties that could include prison time.
In the United Kingdom, laws in force since the 1990s prohibiting the possession of child pornography images had already been used to convict people who made photomontages with image-editing software.
Several organizations that monitor online crime welcomed the Poole court's decision, including the Internet Watch Foundation (IWF), which hailed a "historic" ruling in a press release, describing generative AI software as "factories capable of producing the most terrible images". The foundation's chief executive, Susie Hargreaves, said this kind of misuse of AI is "on the rise" and makes it possible to produce "increasingly realistic" images.