Minnesota Bill Targeting AI ‘Nudification’ Apps Raises Free Speech Questions

ST. PAUL, Minn. (AP) — Molly Kelley recently discovered that someone she knew had used widely available “nudification” technology to create realistic, sexually explicit videos and images of her from family photos that had been shared on social media. “My initial shock quickly turned to horror when I found out that this individual had targeted approximately 80 to 85 other women, most of whom reside in Minnesota, some of whom I have personal connections with, and all of them had ties to the perpetrator,” Kelley said.

Driven in part by Kelley’s testimony, Minnesota is considering a new approach to combating deepfake pornography. A bill with bipartisan support would target companies that operate websites and apps allowing users to upload a photo and transform it into sexually explicit content.

Various states and Congress are in the process of formulating strategies to regulate artificial intelligence. Many have already prohibited the dissemination of sexually explicit deepfakes or revenge porn, regardless of whether AI was utilized in their creation. The goal behind the legislation in Minnesota is to preempt the generation of such content before it circulates online.

Legal experts specializing in AI, however, caution that the proposal could be unconstitutional on free speech grounds.

Why supporters say the bill is necessary:
State Senator Erin Maye Quade, the bill’s lead sponsor, argues that additional restrictions are needed because AI technology is advancing so rapidly. Her bill would require operators of “nudification” sites and apps to block access for users in Minnesota or face civil penalties of up to $500,000 for each unauthorized interaction. Developers would need to build in mechanisms to disable the feature for Minnesota users.

According to Maye Quade, the harm to victims goes beyond dissemination; the very existence of such content is itself distressing.

Kelley emphasized to the press last month how effortlessly individuals could create “hyper-realistic nude images or pornographic videos” within minutes.

To date, the majority of law enforcement efforts have concentrated on addressing distribution and possession of such content.

What Congress, other states, and cities are doing:
In August, San Francisco initiated a pioneering lawsuit against several prominent “nudification” websites, alleging violations of state laws prohibiting fraudulent business practices, nonconsensual pornography, and exploitation of minors. The lawsuit is ongoing.

The U.S. Senate recently passed, by unanimous consent, a bill from Senators Amy Klobuchar (D-MN) and Ted Cruz (R-TX) that would criminalize publishing nonconsensual sexual content, including AI-generated deepfakes. Social media platforms would be required to remove such content within 48 hours of notice from a victim. First Lady Melania Trump recently lobbied for the bill’s passage in the Republican-controlled House, where it awaits a vote.

The Kansas House endorsed a bill last month expanding the

Similar measures are pending in Dakota, Oregon, Rhode Island, South Carolina, and Texas, according to an Associated Press analysis using the bill-tracking software Plural.

Maye Quade said she intends to bring her proposal to lawmakers in other states, noting that many people are unaware of how accessible the technology has become. If Congress proves unresponsive, she said, states must act.

In a poignant display, victims have come forward to share their experiences. Sandi Johnson, the senior legislative policy counsel for RAINN (Rape, Abuse, and Incest National Network), underscored the significance of the Minnesota bill in holding websites accountable for their actions. Testifying recently, she highlighted the distressing reality that once intimate images are created, they can be disseminated widely and anonymously, making their removal nearly impossible.

Megan Hurley recounted her shock at discovering explicit images and videos of herself generated with a “nudification” site. As a massage therapist, she said she felt particularly humiliated, given the sexualized perceptions often attached to her profession. Hurley condemned how easily the technology can be used to create synthetic, invasive imagery targeting not just her but also her loved ones.

Two AI law experts, Wayne Unger of Quinnipiac University School of Law and Riana Pfefferkorn of Stanford University’s Institute for Human-Centered Artificial Intelligence, urged caution, warning that the bill’s broad scope could invite legal challenges. Pfefferkorn suggested that narrowing the focus to images of real children might strengthen its legal standing, though conflicts with existing federal law would remain possible.

Maye Quade defended the constitutionality of her legislation, asserting that it aims to regulate conduct rather than impede free speech. She called for accountability from tech companies, emphasizing the necessity of consequences for the harmful technologies they propagate.

Associated Press journalists Matt O’Brien, John Hanna, and Kate Payne contributed to this report from Providence, Rhode Island; Wichita, Kansas; and Tallahassee, Florida, respectively.

Correction: The spelling of Molly Kelley’s last name has been revised to Kelley.
