Three websites used to create abuse imagery had received 100,000 monthly visits from Australians, watchdog says.
Published On 27 Nov 2025
Internet users in Australia have been blocked from accessing several websites that used artificial intelligence to create child sexual exploitation material, the country’s internet regulator has announced.
The three “nudify” sites withdrew from Australia following an official warning, eSafety Commissioner Julie Inman Grant said on Thursday.
Grant’s office said the sites had been receiving about 100,000 visits a month from Australians and featured in high-profile cases of AI-generated child sexual abuse imagery involving Australian school students.
Grant said such “nudify” services, which let users make images of real people appear naked using AI, have had a “devastating” impact in Australian schools.
“We took enforcement action in September because this provider failed to put in place safeguards to prevent its services being used to create child sexual exploitation material and was even marketing features like undressing ‘any girl,’ with options for ‘schoolgirl’ image generation and features such as ‘sex mode,’” Grant said in a statement.
The development comes after Grant’s office issued a formal warning to the United Kingdom-based company behind the sites in September, threatening civil penalties of up to 49.5 million Australian dollars ($32.2m) if it did not introduce safeguards to prevent image-based abuse.
Grant said Hugging Face, a hosting platform for AI models, had separately also taken steps to comply with Australian law, including changing its terms of service to require account holders to take steps to minimise the risks of misuse involving their platforms.
Australia has been at the forefront of global efforts to prevent the online harm of children, banning social media for under-16s and cracking down on apps used for stalking and creating deepfake images.
The use of AI to create non-consensual sexually explicit images has been a growing concern amid the rapid proliferation of platforms capable of creating photorealistic material at the click of a mouse.
In a survey carried out last year by the US-based advocacy group Thorn, 10 percent of respondents aged 13-20 reported knowing someone who had deepfake nude imagery created of them, while 6 percent said they had been a direct victim of such abuse.