The difference a single word in Utah law can make for victims of AI-generated pornography

As technology advances, adding a single word to state law could be vital to protecting people whose photos and videos have been altered to be sexually explicit.

Sen. Karen Cowan, D-Murray, introduced SB66, a bill that would amend the definition of a counterfeit intimate image by adding the word “generated.” At a public hearing on Wednesday, Cowan said the addition was intended to “fill a potential loophole with all of the new AI technology.”

AI technology has advanced in the two years since Cowan filed her first bill on counterfeit images. “At the time, I called them deepfakes,” she said. “But now it’s also kind of like AI and ChatGPT.”

If the bill is approved, a counterfeit intimate image would be defined as “any visual depiction, photograph, film, video, recording, picture, or computer or computer-generated image or picture, whether made or produced by electronic, mechanical, or other means, that has been edited, manipulated, or generated to depict the likeness of an identifiable individual” in the sexually explicit ways listed by the law.

The Senate Judiciary, Law Enforcement and Criminal Justice Committee unanimously recommended Cowan’s bill on Wednesday.

The bill comes amid growing questions about whether state and federal laws take into account sexually explicit images generated by artificial intelligence.

New York case

One such case occurred in New York. A man named Patrick Curry was sentenced to six months in jail and 10 years of probation after pleading guilty to “multiple crimes in a deepfake scheme, including promoting a sexual performance by a child, aggravated harassment as a hate crime, and stalking,” according to NBC New York.

“Patrick Curry targeted these women, altering photos he took from their social media accounts and those of their family members, and manipulating them using deepfake technology to create pornographic material that he posted online,” Nassau County District Attorney Anne Donnelly told NBC New York.

Donnelly told NBC New York that an explicit photo of a minor discovered by investigators was “the only reason” Curry could be sentenced to prison. “New York State currently lacks adequate criminal laws to protect victims of ‘deepfake’ pornography, both adults and children,” Donnelly said.

The path to legal recourse for victims of deepfake pornography is also a concern for Tori Rosay, director of the Corporate Advocacy Program and an analyst at the National Center on Sexual Exploitation.

“First, we don’t have a federal law that addresses this problem,” Rosay told the Deseret News by phone. “We don’t even have a federal law that addresses image-based sexual abuse in general, or what some call revenge porn.”

While some states have laws addressing deepfakes and sexually explicit images generated by AI, “one problem is that they define deepfakes or digitally altered images in very different contexts, and most of them require showing intent to harm,” Rosay said. She explained that showing intent to harm with these images can be “very difficult,” because the images are often created anonymously.

Another problem, Rosay said, is how quickly technology that can remove clothing from a person’s image and create sexually explicit photos and videos has spread. She hopes that in the future, if a model or code is created with the goal of doing this kind of thing, there will be policies in place to remove it from public interfaces and open-source sites to limit its use.

While completing her master’s degree at Harvard, Rosay spoke with victims of deepfake pornography who described how emotionally devastating it was to have their photos digitally altered to be sexually explicit.

“It’s also gender-based violence,” Rosay said. Even when these photos and videos have been removed from a website, she said, it can be impossible for victims to have peace of mind, because the images could have spread anywhere. “It was never over even after the photos were removed, because you couldn’t confirm whether they were out there or not,” she said, explaining that victims felt their suffering continued even after the photos were taken down in one place.

“The ability to harm individuals is going to grow faster than we can control through laws and policies, because an image can be altered quite a bit — a head can be placed onto a pornographic image,” said Chris McKenna, founder of Protect Young Eyes. “The ability of artificial intelligence to alter images is vast.

“Our laws are already 10 steps behind, and the new thing tomorrow will make them 12 steps behind,” McKenna said in a phone interview about how the United States handles deepfake pornography. He explained that laws often fail to account for how quickly artificial intelligence has developed. Many of the laws on the books go back decades, and state and federal law have struggled to keep up with the changes, but McKenna emphasized how important it is to implement changes before more damage is done.

“Participants portrayed their experiences with image-based sexual abuse and sexual deepfakes as irreparable damage,” Rosay wrote in her thesis. “While participants’ experiences were marked by differences in context, age, gender, and medium, complete and insurmountable devastation was experienced unanimously among all victim-survivors.”

Rosay documented one victim, Ella, saying: “Within one second of doing a reverse image search on Google, my laptop screen was filled with dozens of links to my photos on numerous porn sites across multiple pages of search results. My stomach sank to the floor. My heart was beating out of my chest. My eyes widened in shock and disbelief. I was terrified.”

For many victims, the suffering does not stop at the images.

“While there is a definitive catalyst (creation) for future experiences of abuse, victim-survivors are unable to determine a definitive end to their abuse — because it does not exist,” Rosay wrote in her thesis. “In some cases, victim-survivors were inundated with intrusive comments and harassment in which they were blamed, shamed and held responsible by the online public, as well as by their friends and family, for being turned into pornography without their consent or knowledge, for years after the initial event occurred.”
