X (formerly Twitter) acts swiftly to take down deepfake nude images that are reported as copyright violations — but not when they're reported under "nonconsensual nudity," a study has found.
The paper, published by researchers at the University of Michigan and Florida International University, is an audit of X's reporting systems and hasn't yet been peer-reviewed, 404 Media reported. Researchers created five AI "personas" of young white women (to prevent further variables of race, gender, and age) and then made 10 replica images of each, resulting in 50 images. In terms of the ethics around generating deepfake porn themselves, researchers said these images underwent a "rigorous verification process" to ensure they didn't represent an existing individual.
They posted these images to X on 10 "poster accounts" they created, and then they created five X accounts to report the images. Twenty-five images were reported as Digital Millennium Copyright Act (DMCA) violations, and the other 25 were reported as nonconsensual nudity.
Researchers then waited three weeks to see the results of these reports. All 25 images reported for copyright were removed from X within 25 hours. In contrast, none of the images reported for nonconsensual nudity were removed within the three-week waiting period.
"Our findings reveal a significant disparity in the effectiveness of content removal processes between reports made under the DMCA and those made under X's internal nonconsensual nudity policy," the study states. "This highlights the need for stronger and directed regulations and protocols to protect victim-survivors."
X owner Elon Musk dissolved the platform's trust and safety council in 2022, but the site has recently opened up two dozen safety and cybersecurity positions in the U.S. Mashable has reached out to X for comment.
Earlier this year, WIRED found that victims of nonconsensual deepfake porn leveraged copyright laws to take down deepfakes on Google.
If you have had intimate images shared without your consent, call the Cyber Civil Rights Initiative's 24/7 hotline at 844-878-2274 for free, confidential support. The CCRI website also includes helpful information as well as a list of international resources.