Feds say Army soldier used AI to create child sex abuse images

A U.S. Army soldier stationed in Alaska used artificial intelligence to generate child sexual abuse material in a criminal case that underscores the lengths that online predators will go to exploit children, federal prosecutors said this week.

Seth Herrera, 34, used AI chatbots to create pornography of minors whom he knew, the Justice Department said. He also viewed tens of thousands of images depicting violent sexual abuse of children, including infants, according to court records.

“Criminals considering the use of AI to perpetrate their crimes should stop and think twice, because the Department of Justice is prosecuting AI-enabled criminal conduct to the fullest extent of the law and will seek increased sentences wherever warranted,” said Deputy Attorney General Lisa Monaco.

The FBI issued a public service announcement earlier this year about child sexual abuse material, noting all such images and videos, including those created through AI, are illegal.

The arrest comes as federal officials warn about a rise in child sexual abuse content created with AI, which allows offenders to produce images and videos on an exponentially larger scale, according to the Department of Homeland Security. The technology poses new challenges for law enforcement targeting the content, but it may also serve as a tool to quickly and accurately identify offenders and victims, DHS said.

Court papers detail child pornography chat groups

According to a memo in support of pre-trial detention filed in U.S. District Court for the District of Alaska, Herrera joined online messaging groups devoted to trafficking the abusive content. The soldier, stationed at Joint Base Elmendorf-Richardson in Anchorage, saved “surreptitious recordings” of minors undressing in his home and then used AI chatbots to generate exploitative content of them, according to federal court documents.

He also used images and videos of children posted to social media to create sexually abusive material, according to the memo.

Homeland Security Investigations agents executed a search warrant at Herrera’s home, where he lives with his wife and daughter, according to court records. Three Samsung Galaxy phones contained tens of thousands of videos and images, dating back to at least March 2021, that depicted rape and other sexual abuse of children as young as infants, the memo said. Herrera stored the material in a password-protected app disguised as a calculator on his phone, prosecutors said.

Herrera also sought out sexually abusive content that depicted children roughly the age of his daughter, according to the memo, and six children lived in the same military-base fourplex where he resided.

Court records say he admitted in an interview to viewing child sexual abuse content online for the past year and a half.

“Absolutely no child should suffer these travesties, and no person should feel immune from detection and prosecution for these crimes by HSI and its partners in law enforcement," said Katrina W. Berger, executive associate director of Homeland Security Investigations.

Herrera was arrested Friday and is charged with transportation, receipt and possession of child pornography. He faces a maximum penalty of 20 years in prison. His initial court appearance was expected Tuesday.

A public defender listed in court records for Herrera did not immediately return USA TODAY’s request for comment Monday.

Combating sexual predators in age of AI

The arrest is the latest in a string of cases nationwide as federal law enforcement agents grapple with sexual predators’ use of new technology.

“Federal law prohibits the production, advertisement, transportation, distribution, receipt, sale, access with intent to view, and possession of any CSAM (child sexual abuse material), including realistic computer-generated images,” according to an FBI public service announcement.

Officials say they have also been able to use the new technology to catch offenders. In 2023, Homeland Security Investigations used machine learning models to identify 311 cases of online sexual exploitation. The three-week mission, dubbed Operation Renewed Hope, led to the identification or rescue of more than 100 victims and the arrests of several suspected offenders, HSI said.

Suspected production of child sexual abuse content, including AI-generated material, can be reported to the National Center for Missing and Exploited Children by calling 1-800-THE-LOST or online at www.cybertipline.org. It can also be reported to the FBI Internet Crime Complaint Center at www.ic3.gov.