Following the viral spread of pornographic AI images of singer Taylor Swift, government leaders are addressing the creation and sharing of sexualized AI-generated deepfakes.
On Jan. 30, a bipartisan group of Senators introduced a new bill that would criminalize the act of spreading nonconsensual and sexualized “digital forgeries” created using artificial intelligence. Digital forgeries are defined as "a visual depiction created through the use of software, machine learning, artificial intelligence, or any other computer-generated or technological means to falsely appear to be authentic."
Currently known as the Disrupt Explicit Forged Images and Non-Consensual Edits Act of 2024 (or the "Defiance Act"), the legislation would also provide a path to civil recourse for victims whose likenesses are depicted in nude or sexually explicit images. Through the bill, victims could sue "individuals who produced or possessed the forgery with intent to distribute it" or anyone who received the material knowing it was not made with consent, the Guardian reported.
"The volume of deepfake content available online is increasing exponentially as the technology used to create it has become more accessible to the public," wrote judiciary chair Richard J. Durbin and Rep. Lindsey Graham. "The overwhelming majority of this material is sexually explicit and is produced without the consent of the person depicted. A 2019 study found that 96 percent of deepfake videos were nonconsensual pornography."
Senate majority whip Dick Durbin explained in a press release that the bill's quick introduction was spurred explicitly by the viral images of Swift and the White House's demand for accountability. "This month, fake, sexually explicit images of Taylor Swift that were generated by artificial intelligence swept across social media platforms. Although the imagery may be fake, the harm to the victims from the distribution of sexually explicit 'deepfakes' is very real."
In a statement from White House press secretary Karine Jean-Pierre on Jan. 26, the Biden administration expressed its desire for Congress to address deepfake proliferation amid weak enforcement from social media platforms. “We know that lax enforcement disproportionately impacts women and also girls, sadly, who are the overwhelming targets of online harassment and also abuse," Jean-Pierre told reporters. "Congress should take legislative action."
The creation of deepfake "porn" has been criminalized in other countries and some U.S. states, although such laws have yet to be widely adopted. Mashable's Meera Navlakha has reported on a worsening social media landscape that's disregarded advocates' ongoing demands for protection and accountability, writing, "The alarming reality is that AI-generated images are becoming more pervasive, and presenting new dangers to those they depict. Exacerbating this issue is murky legal ground, social media platforms that have failed to foster effective safeguards, and the ongoing rise of artificial intelligence."
A climate of worsening media literacy — and the steep rise of digital misinformation and deepfake scams — prompts an even greater need for industry action.
If you have had intimate images shared without your consent, call the Cyber Civil Rights Initiative’s 24/7 hotline at 844-878-2274 for free, confidential support. The CCRI website also includes helpful information as well as a list of international resources.