Amazon Found High Volume Of Child Sex Abuse Material In AI Training Data

Posted on January 29, 2026 by admin


Amazon.com Inc. reported hundreds of thousands of pieces of content last year that it believed included child sexual abuse, which it found in data gathered to improve its artificial intelligence models. Though Amazon removed the content before training its models, child safety officials said the company has not provided information about its source, potentially hindering law enforcement from finding perpetrators and protecting victims. 

Throughout last year, Amazon detected the material in its AI training data and reported it to the National Center for Missing and Exploited Children, or NCMEC. The organization, which was established by Congress to field tips about child sexual abuse and share them with law enforcement, recently started tracking the number of reports specifically tied to AI products and their development. In 2025, NCMEC saw at least a fifteen-fold increase in these AI-related reports, with “the vast majority” coming from Amazon. The findings haven’t been previously reported.

An Amazon spokesperson said the training data was obtained from external sources, and the company doesn’t have the details about its origin that could aid investigators. It’s common for companies to use data scraped from publicly available sources, such as the open web, to train their AI models. Other large tech companies have also scanned their training data and reported potentially exploitative material to NCMEC. However, the clearinghouse pointed to “glaring differences” between Amazon and its peers. The other companies collectively made just “a handful of reports,” and provided more detail on the origin of the material, a top NCMEC official said.

In an emailed statement, the Amazon spokesperson said that the company is committed to preventing child sexual abuse material across all of its businesses. “We take a deliberately cautious approach to scanning foundation model training data, including data from the public web, to identify and remove known [child sexual abuse material] and protect our customers,” the spokesperson said. 

The spike in Amazon’s reports coincides with a fast-moving AI race that has left companies large and small scrambling to acquire and ingest huge volumes of data to improve their models. But that race has also complicated the work of child safety officials, who are struggling to keep up with the changing technology, and challenged regulators tasked with safeguarding AI from abuse. AI safety experts warn that quickly amassing large datasets without proper safeguards carries grave risks.

Amazon accounted for most of the more than 1 million AI-related reports of child sexual abuse material submitted to NCMEC in 2025, the organization said. It marks a jump from the 67,000 AI-related reports that came from across the tech and media industry a year prior, and just 4,700 in 2023. This category of AI-related reports can include AI-generated photos and videos, or sexually explicit conversations with AI chatbots. It can also include photos of real victims of sexual abuse that were collected, even unintentionally, in an effort to improve AI models. 

Training AI on illegal and exploitative content raises new concerns. It could shape a model’s underlying behaviors, potentially improving its ability to digitally alter and sexualize photos of real children or to create entirely new images of sexualized children who never existed. It also threatens to perpetuate the circulation of the images the models were trained on, re-victimizing children who have suffered abuse.

The Amazon spokesperson said that, as of January, the company is “not aware of any instances” of its models generating child sexual abuse material. None of its reports submitted to NCMEC were of AI-generated material, the spokesperson added. Instead, the content was flagged by an automatic detection tool that compared it against a database of known child abuse material involving real victims, a process called “hashing.” Approximately 99.97% of the reports resulted from scanning “non-proprietary training data,” the spokesperson said.

Amazon believes it over-reported these cases to NCMEC to avoid accidentally missing something. “We intentionally use an over-inclusive threshold for scanning, which yields a high percentage of false positives,” the spokesperson added.
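The hash-matching workflow described above can be sketched in a few lines. This is a deliberately simplified illustration, not Amazon’s actual tooling: the hash values and sample data below are placeholders, and production systems typically use perceptual hashes (which survive resizing and re-encoding) rather than the exact-match cryptographic hash used here.

```python
import hashlib

# Hypothetical database of hashes of known flagged content (placeholder values).
# Real detection systems use perceptual hashing, which matches visually similar
# images; SHA-256, used here for illustration, only catches byte-exact copies.
KNOWN_HASHES = {
    # sha256 of the placeholder bytes b"flagged sample"
    hashlib.sha256(b"flagged sample").hexdigest(),
}

def is_known_match(blob: bytes) -> bool:
    """Return True if the blob's hash appears in the known-content database."""
    return hashlib.sha256(blob).hexdigest() in KNOWN_HASHES

# Scanning a training corpus: matches are removed before training and would
# be reported, mirroring the workflow the article describes.
corpus = [b"flagged sample", b"benign training sample"]
clean_corpus = [item for item in corpus if not is_known_match(item)]
```

An “over-inclusive threshold,” as the spokesperson describes it, would correspond to a perceptual-hash similarity cutoff set loosely enough that near-misses are flagged too, trading false positives for fewer missed matches.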

Amazon has more than 900 data center facilities worldwide. Photo Credit: Bloomberg

The AI-related reports received last year are just a fraction of the total number submitted to NCMEC. The larger category of reports also includes suspected child sexual abuse material sent in private messages or uploaded to social media feeds and the cloud. In 2024, for example, NCMEC received more than 20 million reports from across industry, with most coming from Meta Platforms Inc. subsidiaries Facebook, Instagram and WhatsApp. Not all reports are ultimately confirmed as containing child sexual abuse material, referred to with the acronym CSAM.

Still, the volume of suspected CSAM that Amazon detected across its AI pipeline in 2025 stunned child safety experts interviewed by Bloomberg News. The hundreds of thousands of reports made to NCMEC marked a drastic surge for the company. In 2024, Amazon and all of its subsidiaries made a total of 64,195 reports. 

“This is really an outlier,” said Fallon McNulty, the executive director of NCMEC’s CyberTipline, the entity to which US-based social media platforms, cloud providers and other companies are legally required to report suspected CSAM. “Having such a high volume come in throughout the year begs a lot of questions about where the data is coming from, and what safeguards have been put in place.” 

McNulty, speaking in an interview, said she has little visibility into what’s driving the surge of sexually exploitative material in Amazon’s initial training data sets. Amazon has provided “very little to almost no information” in its reports about where the illicit material originally came from, who had shared it, or whether it remains actively available on the internet, she said.

While Amazon is not required to share this level of detail, the lack of information makes it impossible for NCMEC to track down the material’s origin and work to get it removed, McNulty said. It also limits relevant law enforcement agencies tasked with searching for sex offenders and children in active danger. “There’s nothing then that can be done with those reports,” she said. “Our team has been really clear with [Amazon] that those reports are inactionable.”

When asked why the company didn’t disclose information about the possible origin of the material, or other key details, the Amazon spokesperson replied, “because of how this data is sourced, we don’t have the data that comprises an actionable report.” The spokesperson did not explain how the third-party data was sourced or why the company did not have sufficient information to create actionable reports. “While our proactive safeguards cannot provide the same detail in NCMEC reports as consumer-facing tools, we stand by our commitment to responsible AI and will continue our work to prevent CSAM,” the spokesperson said.

NCMEC, a nonprofit, receives funding both from the US government and private industry. Amazon is among its funders and holds a corporate seat on its board. 

“There should be more transparency on how companies are gathering and analyzing the data to train their models — and how they’re training them,” said David Thiel, the former chief technologist at the Stanford Internet Observatory, who has researched the prevalence of child sexual abuse material in AI training data. 

Such data can be licensed, purchased or scraped from the internet, or could be so-called synthetic data, which is text or images created by other AI tools. As AI companies seek to release new models quickly, “the rapid gathering of data is a much higher priority than doing safety analyses,” Thiel said. He warned that there are “always some errors” when it comes to sifting out CSAM from training data, and believes the industry needs to be more open about where its data is coming from. 

Amazon’s Bedrock offering, which gives customers access to various AI models so they can build their own AI products, includes automated detection for known CSAM and rejects and reports positive matches. The company’s consumer-facing generative AI products also allow users to report content that escapes its controls. 

The Seattle-based tech giant scans for CSAM across its other businesses, too, including its consumer photo storage service. Amazon’s cloud computing division, Amazon Web Services, also removes CSAM when it’s discovered on the web services it hosts. McNulty said AWS submitted far fewer reports than came from Amazon’s AI efforts. Amazon declined to break out specific reporting data across its various business units, but noted it would share broad data in March.

Only recently have technology companies really begun to scrutinize their AI models and training data for CSAM, said David Rust-Smith, a data scientist at Thorn, a nonprofit organization that provides tools to companies, including Amazon, to detect the exploitative material. 

“There’s definitely been a big shift in the last year of people coming to us asking for help cleaning data sets,” said Rust-Smith. He noted that “some of the biggest players” have sought to apply Thorn’s detection tools to their training data, but declined to speak about any individual company. Amazon did not use Thorn’s technology to scan its training data, the spokesperson confirmed.

Rust-Smith said AI-focused companies are approaching Thorn with a newfound urgency. “People are learning what we already knew, which is, if you hoover up a ton of the internet, you’re going to get [child sexual abuse material],” he said. 

Amazon was not the only company to spot and report potential CSAM from its AI workflows last year. Alphabet Inc.’s Google and OpenAI told Bloomberg News that they scan AI training data for exploitative material — a process that has surfaced potential CSAM, which the companies then reported to NCMEC. Meta and Anthropic PBC said they, too, search training data for CSAM. Meta did not comment on whether it had identified the material, but said it would report to NCMEC if it did. Anthropic said it has not reported such material out of its training data. Meta and Google said that they’ve taken efforts to ensure that reports related to their AI workflows are distinguishable from those generated by other parts of their business.

McNulty said that, with the exception of Amazon, the AI-related reports NCMEC received last year came in “really, really small volumes,” and included key details that allowed the clearinghouse to pass on actionable information to law enforcement.

“Simply flagging that you came across something but not providing any type of actionable detail doesn’t help the larger child safety space,” McNulty said.




