
State senator pushes to redefine child pornography laws to include AI-generated images

Published: 3 months ago by https://www.facebook.com/NewsChannel8Tulsa/, Adam King & KOKH Staff in Politics, Tech

A bill filed for the upcoming legislative session by State Senator Darrell Weaver aims to redefine what's considered child pornography.

The new definition would include AI-generated media that could resemble child-like images.

"There is a difference between pornography and child abuse material. We need to be able to wrap our arms around that and be able to do something about that in a legal sense," says State Sen. Weaver.

According to Blaine Phillips with the Oklahoma State Bureau of Investigation (OSBI), the Internet Crimes Against Children Division worked more than 9,000 cyber tips last year from the National Center for Missing & Exploited Children.

Some reports of possible sexual abuse crimes are getting harder to investigate because of artificial intelligence.

"That's what gets kind of confusing with the AI content. Is this a real child that we haven't identified yet or is this computer generated?" says Blaine Phillips, who works with the OSBI. "The technology is advancing so quickly it is difficult to determine if something is fake or not."

One of the big questions surrounding AI images is how someone can be prosecuted for generating an image on a computer when there's potentially no identifiable victim.

"We'll get images of a child but we won't know who the victim is, but we can still prosecute that based on the determination that this is clearly a child even if we don't have that person identified," says Phillips.

Investigators say it's alarming how quickly something that was made for good can take a turn.

"They, I believe, were intended with positive uses and the speed at which they've become negative is very surprising. It seems as soon as the AI is out there, it's immediately snatched up to be used for nefarious reasons," says Rachel Savory with the OSBI.

Both Savory and Phillips tell Fox 25 that any legislation right now would be good, just to give prosecutors a guide on how to move forward in cases like this.

They believe it could also clarify some previous rulings to help prosecutors in the future.


Topics: AI, Sexual Exploitation
