UK specialists say that ChatGPT is prompting an increase in reports of organised ritual abuse, as victims of so‑called “satanic” sexual violence turn to the AI system for therapeutic help.
Police contend that organised ritual abuse and “witchcraft, spirit possession and ritual abuse” (WSPRA) targeting children are largely hidden in the UK. No contemporary offence directly addresses the conduct, which is characterised by sexual assault, violence and neglect combined with ritualistic features – occasionally drawn from satanic, fascist or occult doctrines – used to dominate victims.
Perpetrators range from abusive families and organised networks to human‑trafficking groups, internet‑based gangs and paedophile circles.
Since 1982, fourteen criminal cases in the UK have formally recognised ritualistic elements in sexual abuse. Yet a 2025 study by clinical psychologist Dr Elly Hanson concluded that convictions represent only the “tip of the iceberg”.
Experts are now introducing training for police services, led by the National Police Chiefs’ Council (NPCC), which has established a dedicated working group.
Gabrielle Shaw, chief executive of the National Association of People Abused in Childhood (Napac), noted a “sustained rise” in ritual‑abuse reports over the past eighteen months, with more individuals indicating that AI prompted them to come forward.
Shaw explained: “In the past six to twelve months, callers to the Napac helpline tell us, ‘I was referred to you by ChatGPT.’ People are employing AI, specifically ChatGPT, as a means of therapy and self‑exploration. Opinions vary, but if it opens a pathway to assistance, that is beneficial.”
She added: “Typically we observe call surges on dates with strong supernatural or religious connotations – this is not a surge but a steady increase. Awareness of the offence and of available support is growing … references to satanism appear frequently.”
The NPCC, Napac and the Hydrant policing programme – which assists forces across the country with child‑protection matters – commissioned Hanson’s review last year and this month released a WSPRA briefing for practitioners.
Earlier this year, members of a Scottish paedophile ring who masqueraded as witches and wizards received prison sentences for sexual crimes.
Shaw reported that of 36,700 calls to Napac over nine years, 1,310 referenced organised ritual abuse. She noted the offending can be “intergenerational”, and although most perpetrators are male, survivors have identified “grandmothers and aunts” as abusers.
Richard Fewkes, director of the Hydrant Programme, said that the “fantastical” nature of ritual elements has widened the justice gap.
He continued: “We must overhaul the system’s response – the problem exists, it is not being reported to police … we have been aware of it for many years.”