By Faith Barbara N Ruhinda at 1214 EAT on Tuesday 26 August 2025

A survivor of child sexual abuse has made a direct plea to Elon Musk, urging him to stop links to images of her abuse from being circulated on his social media platform, X (formerly Twitter).
“Hearing that my abuse – and the abuse of so many others – is still being circulated and commodified here is infuriating,” says “Zora” (a pseudonym), who lives in the United States and was first abused more than 20 years ago.
“Every time someone sells or shares child abuse material, they directly fuel the original, horrific abuse.”
An investigation traced the seller, who was contacted via the messaging app Telegram, to a bank account in Jakarta, Indonesia.
Zora, who was abused by a family member, says a series of images depicting her abuse have become widely known and circulated among online networks of child abusers. Her experience is far from unique, as countless victims continue to suffer from the ongoing distribution of images documenting their trauma.
“My body is not a commodity. It never has been, and it never will be,” she says.
“Those who distribute this material are not passive bystanders – they are complicit perpetrators.”

X maintains that tackling child exploitation remains “a top priority” for the company. However, Zora and other victims say far more needs to be done to ensure they are not continually retraumatized by the unchecked spread of such material online.
The abuse Zora endured as a child was once hidden in the depths of the so-called dark web. Now, more than two decades later, she faces the devastating reality that links to those same images are being openly promoted on X.
While social media platforms continue efforts to rid their services of illegal content, experts warn the scale of the problem is staggering.
In 2024 alone, the US National Center for Missing and Exploited Children (NCMEC) received more than 20 million mandatory reports from tech companies regarding incidents of child sexual abuse material (CSAM)—illegal images and videos being shared on their platforms.
NCMEC works to identify both victims and perpetrators, and refers cases to law enforcement agencies around the world.
As part of our investigation, we contacted members of the hacktivist collective Anonymous, who have been tracking accounts involved in the trade of child abuse content on X. One member told us the situation remains as dire as ever.
They directed us to a specific account on X, which used a head-and-shoulders image of a real child as its profile photo. While the image itself was not overtly explicit, the bio contained coded language and emojis indicating the user was selling child sexual abuse material. The account also included a direct link to a Telegram profile.
A trader, believed to be operating from Indonesia, has been offering so-called “VIP packages”—collections of child sexual abuse images and video files—for sale to paedophiles across the globe.
The activist has been working to report and take down the trader’s numerous accounts on X. However, each time an account was removed by the platform’s moderation systems, a new one would quickly take its place.
According to the activist, the trader appeared to be managing more than 100 nearly identical accounts. When contacted directly through the messaging app Telegram, the trader claimed to have a vast collection of material for sale.
“I have baby. Kids young 7–12,” he wrote in a message. He also stated that some of the files depicted child rape.
Reporters posing as potential buyers contacted the trader, who responded by providing links to sample content. The material was not opened or viewed. Instead, the links were referred to the Canadian Centre for Child Protection (C3P) in Winnipeg, which works in coordination with law enforcement and is legally permitted to examine such content to assist in victim identification and investigations.
“The Telegram account was, for lack of a better term, a taster pack—essentially a collage of the material he had available, featuring images of numerous victims,” said Lloyd Richardson, C3P’s director of technology. “When we examined the collages, it was clear there were thousands of images.”
Among them were images of Zora.
Her abuser in the United States was prosecuted and imprisoned many years ago. But by then, the footage of her abuse had already been circulated and sold worldwide.
“I’ve tried for years to move past my trauma and not let it define my life,” Zora said. “But perpetrators and stalkers still manage to find ways to access and share this filth.”

As she grew older, stalkers uncovered her identity and began harassing her online. Some made direct threats.
“I feel like I’m being bullied for a crime that stole my childhood,” she added.
To trace the individual selling images of Zora’s abuse, journalists posed as potential buyers.
The trader responded by sharing both bank account details and an online payment handle—each registered under the same name.
An activist from Anonymous had independently discovered that this same name was linked to two previous money transfers and an additional bank account.
Further investigation led to a man with that name residing on the outskirts of Jakarta, Indonesia.
A local producer visited the address and confronted a man at the property. When presented with the evidence, he appeared surprised and said, “I don’t know anything about this.”
He confirmed ownership of one of the bank accounts but claimed it had only been used for a single mortgage-related transaction. He said he had not accessed it since and would contact his bank to investigate any misuse. He denied any knowledge of the second account or the transactions linked to it.
At this stage, it remains unclear whether, or to what extent, the man may be involved in the illegal activity. As a result, his identity is being withheld.
The marketing of Zora’s abuse images follows a pattern used by hundreds of traders globally, according to findings from this investigation.
Posts promoting illegal material often appear on X under specific hashtags known within paedophile networks. While some images shared on the platform are taken from known child sexual abuse material, they are frequently cropped or edited in ways that avoid violating automated moderation systems—allowing them to remain online and serve as advertisements for more explicit content.
When Elon Musk acquired the platform in 2022, he publicly declared that eliminating child sexual abuse material was his “top priority.” Despite this, the continued circulation of such content suggests persistent gaps in enforcement and content moderation.
Social media platforms, not just X, must do far more to prevent repeat offenders from exploiting their systems, Richardson says.
“It’s great that we can send a takedown notice and have a platform remove an account, but that’s the bare minimum,” he said.
The larger issue, Richardson explained, is that abusers can often return within days using new accounts, continuing to post and promote illegal content with little resistance.
In response, X said it maintains a “zero tolerance” policy for child sexual exploitation.
“We continually invest in advanced detection to enable us to take swift action against content and accounts that violate our rules,” a spokesperson said.
The company added that it works “closely with the National Center for Missing and Exploited Children (NCMEC)” and supports law enforcement in efforts to prosecute offenders.
Telegram, another platform used in the trade of child abuse material, said it also has strict enforcement in place.
“All channels are moderated, and more than 565,000 groups and channels related to the spread of CSAM have been banned so far in 2025,” a spokesperson said.
The platform reported employing more than 1,000 moderators and stated: “Telegram proactively monitors public content across the platform and removes objectionable material before it can reach users or be reported.”
When informed that her images were still being circulated on X, Zora shared a direct message for the platform’s owner, Elon Musk:
“Our abuse is being shared, traded, and sold on the app you own. If you would act without hesitation to protect your own children, I beg you to do the same for the rest of us. The time to act is now.”
