The Deepfake Nudes Crisis in Schools Is Worse Than You Thought

Original link: https://www.wired.com/story/deepfake-nudify-schools-global-crisis/

AI-powered "deepfake" technology is fueling a surge in harmful sexual abuse, particularly targeting teenagers. While such images have existed since 2017, easily accessible apps now let anyone, often teenage boys, create nonconsensual sexualized images and videos of others with ease.

The impact on victims is devastating, causing severe distress, anxiety, and fear of ongoing online exposure, including exposure to potential predators. Many victims withdraw from school and social life. Motivations for creating the imagery include sexual gratification, revenge, humiliation, and social control, underscoring preexisting harmful gender dynamics.

Schools around the world are mitigating the risk of image abuse by restricting student photos in yearbooks and on online platforms. Experts stress that the problem is not merely technological but a reflection of deeper societal issues, amplified by the *scale*, *speed*, and *accessibility* that AI lends to harmful behavior.

## Deepfake Nudes and Schools: A Growing Crisis

A recent WIRED article (linked in the Hacker News discussion) highlights a growing problem: AI-generated nude images, often targeting teenage girls. While digitally manipulating images to produce nonconsensual pornography is nothing new, the accessibility and realism of current AI tools have dramatically lowered the barrier for perpetrators.

The core issue is the humiliation and harassment victims experience, even when the images are known to be fake. Some commenters downplayed the problem, arguing it is no different from earlier forms of image manipulation, or blamed societal "puritanism." Many others, however, argued that the ease and persuasiveness of deepfakes greatly amplify the harm.

The discussion focused on potential solutions, including restricting AI tools, improving education about deepfakes, and enforcing existing anti-harassment and revenge-porn laws. A key point was that while centralized platforms can be regulated, locally run AI models pose a far greater challenge. Ultimately, many argued that technical solutions alone are insufficient: addressing the problem requires broader social change focused on preventing the *sharing* of these images, not just the act of creation.

## Original Article

Nevertheless, clear patterns emerge. In nearly all cases, teenage boys are allegedly responsible for the creation of the images or videos. They are often shared in social media apps or via instant messaging with classmates. And they are hugely harmful to the victims. "I'm worried that every time they see me, they see those photos," one victim in Iowa said earlier this year. "She's been crying. She hasn't been eating," another's family said.

In many instances, victims do not want to attend school or face those who created explicit images or videos of them. "She feels hopeless because she knows that these images will likely make it onto the internet and reach pedophiles," say lawyer Shane Vogt and three Yale Law School students, Catharine Strong, Tony Sjodin, and Suzanne Castillo, who are representing one unnamed New Jersey teenager in legal action against a nudifying service. "She is severely distressed by the knowledge that these images are out there, and she will have to monitor the internet for the rest of her life to keep them from spreading."

In South Korea and Australia, schools have given pupils the option not to have their photos in yearbooks or stopped posting images of students on their official social media accounts, citing their use for potential deepfake abuse. “Around the world, there have been cases where school images were taken from public social media pages, altered using AI, and turned into harmful deepfakes,” one school in Australia said. “Imagery will instead feature side profiles, silhouettes, backs of heads, distant group shots, creative filters, or approved stock photography.”

Sexual deepfakes created using AI have existed since around the end of 2017; however, as generative AI systems have emerged and become more powerful, they have led to a shadowy ecosystem of “nudification” or “undress” technologies. Dozens of apps, bots, and websites allow anyone to create sexualized images and videos of others with just a couple of clicks, often with no technical knowledge.

“What AI changes is scale, speed, and accessibility,” says Siddharth Pillai, cofounder and director of the RATI Foundation, a Mumbai-based organization working to prevent violence against women and children. “The technical barrier has dropped significantly, which means more people, including adolescents, can produce more convincing outputs with minimal effort. As with many AI-enabled harms, this results in a glut of content.”

Amanda Goharian, the director of research and insights at child safety group Thorn, says its research indicates a range of motivations behind teenagers creating deepfake abuse, from sexual motivation and curiosity to revenge, or even teens daring each other to create the imagery. Studies of adults who have created deepfake sexual abuse similarly show a host of different reasons why the images may be created. "The goal is not always sexual gratification," Pillai says. "Increasingly, the intent is humiliation, denigration, and social control."

"It's not just about the tech," says Tanya Horeck, a feminist media studies professor at Anglia Ruskin University who researches gender-based violence and has studied sexualized deepfakes in UK schools. "It's about the long-standing gender dynamics that facilitate these crimes."
