X (formerly Twitter) acts swiftly to take down deepfake nude images that are reported as copyright violations, but not when they're reported under "nonconsensual nudity," a study has found.
The paper, published by researchers at the University of Michigan and Florida International University, is an audit of X's reporting systems and hasn't yet been peer-reviewed, 404 Media reported. Researchers created five AI "personas" of young white women (to prevent further variables of race, gender, and age) and then made 10 replica images of each, resulting in 50 images. In terms of the ethics around generating deepfake porn themselves, researchers said these images underwent a "rigorous verification process" to ensure they didn't represent an existing individual.
They posted these images to X on 10 "poster accounts" they created, and then created five X accounts to report the images. Twenty-five images were reported as Digital Millennium Copyright Act (DMCA) violations, and the other 25 were reported as nonconsensual nudity.
Researchers then waited three weeks to see the results of these reports. All 25 images reported for copyright were removed from X within 25 hours. In contrast, none of the images reported for nonconsensual nudity were removed within the three-week waiting period.
"Our findings reveal a significant disparity in the effectiveness of content removal processes between reports made under the DMCA and those made under X's internal nonconsensual nudity policy," the study states. "This highlights the need for stronger and directed regulations and protocols to protect victim-survivors."
X owner Elon Musk dissolved the platform's trust and safety council in 2022, but the site has recently opened up two dozen safety and cybersecurity positions in the U.S. Mashable has reached out to X for comment.
Earlier this year, WIRED found that victims of nonconsensual deepfake porn leveraged copyright laws to take down deepfakes on Google.
If you have had intimate images shared without your consent, call the Cyber Civil Rights Initiative's 24/7 hotline at 844-878-2274 for free, confidential support. The CCRI website also includes helpful information as well as a list of international resources.