San Francisco goes after websites that make AI deepfake nudes of women and girls
Nearly a year after AI-generated nude images of high school girls upended a community in southern Spain, a juvenile court this summer sentenced 15 of their classmates to a year of probation.
But the artificial intelligence tool used to create the harmful deepfakes is still easily accessible on the internet, promising to “undress any photo” uploaded to the website within seconds.
Now a new effort to shut down the app and others like it is being pursued in California, where San Francisco this week filed a first-of-its-kind lawsuit that experts say could set a precedent but will also face many hurdles.
“The proliferation of these images has exploited a shocking number of women and girls across the globe,” said David Chiu, the elected city attorney of San Francisco who brought the case against a group of widely visited websites based in Estonia, Serbia, the United Kingdom and elsewhere.
“These images are used to bully, humiliate and threaten women and girls,” he said in an interview with The Associated Press. “And the impact on the victims has been devastating on their reputation, mental health, loss of autonomy, and in some instances, causing some to become suicidal.”
The lawsuit, brought on behalf of the people of California, alleges that the services broke numerous state laws against fraudulent business practices, nonconsensual pornography and the sexual abuse of children. But it can be hard to determine who runs the apps, which are unavailable in phone app stores but still easily found on the internet.
Contacted late last year by the AP, one service claimed by email that its “CEO is based and moves throughout the USA” but declined to provide any evidence or answer other questions. The AP is not naming the specific apps being sued so as not to promote them.
“There are a number of sites where we don’t know at this moment exactly who these operators are and where they’re operating from, but we have investigative tools and subpoena authority to dig into that,” Chiu said. “And we will certainly utilize our powers in the course of this litigation.”
Many of the tools are being used to create realistic fakes that “nudify” photos of clothed adult women, including celebrities, without their consent. But they’ve also popped up in schools around the world, from Australia to Beverly Hills in California, typically with boys creating the images of female classmates that then circulate widely through social media.
One of the first widely publicized cases came last September in Almendralejo, Spain, where a physician whose daughter was among the girls victimized helped bring the abuse to public attention. She said she is satisfied by the severity of the sentence their classmates received after the court decision earlier this summer.
But it is “not only the responsibility of society, of education, of parents and schools, but also the responsibility of the digital giants that profit from all this garbage,” Dr. Miriam al Adib Mendiri said in an interview Friday.
She applauded San Francisco’s action but said more efforts are needed, including from bigger companies like California-based Meta Platforms and its subsidiary WhatsApp, which was used to circulate the images in Spain.
While schools and law enforcement agencies have sought to punish those who make and share the deepfakes, authorities have struggled with what to do about the tools themselves.
In January, the executive branch of the European Union explained in a letter to a Spanish member of the European Parliament that the app used in Almendralejo “does not appear” to fall under the bloc’s sweeping new rules for bolstering online safety because it’s not a big enough platform.
Organizations that have been tracking the growth of AI-generated child sexual abuse material will be closely following the San Francisco case.
The lawsuit “has the potential to set legal precedent in this area,” said Emily Slifer, the director of policy at Thorn, an organization that works to combat the sexual exploitation of children.
A researcher at Stanford University said that because so many of the defendants are based outside the U.S., it will be harder to bring them to justice.
Chiu “has an uphill battle with this case, but may be able to get some of the sites taken offline if the defendants running them ignore the lawsuit,” said Stanford’s Riana Pfefferkorn.
She said that could happen if the city wins by default in their absence and obtains orders affecting domain-name registrars, web hosts and payment processors “that would effectively shutter those sites even if their owners never appear in the litigation.”