Accounts peddling child abuse content flood some X hashtags as safety partner Thorn cuts ties

by Jesse It’s That Part
June 18, 2025

When Elon Musk took over Twitter in 2022, he said that addressing the problem of child sexual abuse material on the platform was his “top priority.” Three years later, the problem appears to be escalating, as anonymous, seemingly automated X accounts flood hashtags with hundreds of posts per hour advertising the sale of the illegal material.

At the same time, Thorn, a California-based nonprofit that works with tech companies to provide technology that can detect and address child sexual abuse content, told NBC News that it had terminated its contract with X. Thorn said that X stopped paying recent invoices for its work, though it declined to provide details about its deal with the company, citing legal sensitivities.

Some of Thorn’s tools are designed to address the very issue that appears to be growing on the platform.

“We recently terminated our contract with X due to nonpayment,” Cassie Coccaro, head of communications at Thorn, told NBC News. “And that was after months and months of outreach, flexibility, trying to make it work. And ultimately we had to stop the contract.”

In response to requests for comment, X did not address its relationship with Thorn or the ongoing issue of accounts using the platform to market child sexual abuse material (CSAM).

The X app on a phone. (Jaap Arriens / NurPhoto via Getty Images, file)

Many aspects of the child exploitation ads issue, which NBC News first reported on in January 2023, remain the same on the platform. Sellers of CSAM continue to use hashtags based on sexual keywords to advertise to people looking to buy the material. Their posts direct prospective buyers to other platforms where users are asked for money in return for the child abuse material.

Other aspects are new: Some accounts now appear to be automated (also known as bots), while others have taken advantage of “Communities,” a relatively new feature launched in 2021 that encourages X users to congregate in groups “closer to the discussions they care about most.” Using Communities, CSAM advertisers have been able to post into groups of tens of thousands of people devoted to topics like incest, seemingly without much scrutiny.

The Canadian Centre for Child Protection (C3P), an independent online CSAM watchdog group, reviewed several X accounts and hashtags flagged by NBC News that were promoting the sale of CSAM, and followed links promoted by several of the accounts. The organization said that, within minutes, it was able to identify accounts that posted images of previously identified CSAM victims who were as young as 7. It also found apparent images of CSAM in thumbnail previews populated on X and in links to Telegram channels where CSAM videos were posted. One such channel showed a video of a boy estimated to be as young as 4 being sexually assaulted. NBC News did not view or have in its possession any of the abuse material.

Lloyd Richardson, director of information technology at C3P, said the behavior being exhibited by the X users was “a bit old hat” at this point, and that X’s response “has been woefully insufficient.”

“It seems to be a little bit of a game of Whac-A-Mole that goes on,” he said. “There doesn’t seem to be a particular push to really get to the root cause of the issue.”

X says it has a zero tolerance policy “towards any material that features or promotes child sexual exploitation.”

A review of many hashtags with terms known to be associated with CSAM shows that the problem is, if anything, worse than when Musk initially took over. What was previously a trickle of posts of fewer than a dozen per hour is now a torrent propelled by accounts that appear to be automated — some posting several times a minute.

Despite the continued flood of posts and sporadic bans of individual accounts, the hashtags observed by NBC News over several weeks remained open and viewable as of Wednesday. And some of the hashtags that were identified in 2023 by NBC News as hosting the child exploitation advertisements are still being used for the same purpose today.

Historically, Twitter and then X have attempted to block certain hashtags associated with child exploitation. When NBC News first reported on the use of X to market CSAM, X’s head of trust and safety said the company knew it had work to do and would be making changes, including the development of automated systems to detect and block hashtags.

In January 2024, X CEO Linda Yaccarino testified to the Senate Judiciary Committee that the company had strengthened its enforcement “with more tools and technology to prevent bad actors from distributing, searching for, or engaging with [child sexual exploitation] content across all forms of media.”

X CEO Linda Yaccarino testifies during a Senate Judiciary Committee hearing on Capitol Hill in 2024. (Manuel Balce Ceneta / AP, file)

In May 2024, X said it helped Thorn test a tool to “proactively detect text-based child sexual exploitation.” The “self-hosted solution was deployed seamlessly into our detection mechanisms, allowing us to hone in on high-risk accounts and expand child sexual exploitation text detection coverage,” X said.

Pailes Halai, Thorn’s senior manager of accounts and partnerships, who oversaw the X contract, said that some of Thorn’s software was designed to address issues like those posed by the hashtag CSAM posts, but that it wasn’t clear whether X ever fully implemented it.

“They took part in the beta with us last year,” he said. “So they helped us test and refine, etc., and essentially be an early adopter of the product. They then subsequently did move on to being a full customer of the product, but it’s not very clear to us at this point how and if they used it.”

Without Thorn, it’s not entirely clear what child safety mechanisms X is currently employing. “Our technology is designed with safety in mind,” Halai said. “It’s up to the platform to enforce and use the technology appropriately … What we do know on our side is it’s designed to catch the very harms that you’re talking about.”

Halai said Thorn didn’t take the termination of its contract with X lightly.

“It was very much a last-resort decision for us to make,” he said. “We provided the services to them. We did it for as long as we possibly could, exhausted all possible avenues and had to terminate, ultimately, because, as a nonprofit, we’re not exactly in the business of helping to sustain something for a company like X, where we’re actually incurring huge costs.”

Currently, some hashtags, like #childporn, are blocked when using X’s search function, but others are open to browse and are filled with posts advertising CSAM for sale. NBC News found posts appearing to peddle CSAM in 23 hashtags that are often used together in the posts; only two hashtags were blocked by X. The hashtags that were available to be posted to and viewed during NBC News’ review of the platform ranged from references to incest and teenagers to slightly more coded terms, like combinations of words with the name of the defunct video chat platform Omegle, which shut down in 2023 after a child sex exploitation lawsuit. Some hashtags were jumbles of letters and contained only posts advertising CSAM, indicating that they were created for the exclusive purpose of housing the advertisements.

Some usernames of accounts posting the ads were simply a jumble of words associated with CSAM content on the platform, mixing names of social media platforms with other keywords.

Many of the users linked directly to Telegram channels in their posts or their account bios and included explicit references to CSAM. Some posts linked to Discord channels or solicited direct messages to secure Discord links.

Telegram and Discord have distinct positions in the internet’s child exploitation ecosystem, offering semiprivate and private venues for people looking to sell or buy child exploitation material. NBC News previously reported on 35 cases in which adults were prosecuted on charges of kidnapping, grooming or sexual assault that allegedly involved communications on Discord.

A Discord representative said, “Discord has zero tolerance for child sexual abuse material, and we take immediate action when we become aware of it, including removing content, banning users, and reporting to the National Center for Missing and Exploited Children (NCMEC).” The company said in response to NBC News’ outreach that it removed multiple servers “for policy violations unrelated to the sale of CSAM.”

A representative for Telegram said, “CSAM is explicitly forbidden by Telegram’s terms of service and such content is removed whenever discovered.” The representative pointed to the company’s partnership with the U.K.-based Internet Watch Foundation, which maintains a database of known CSAM and provides tools to detect and remove it.

While some of the X accounts posted publicly, others solicited and offered CSAM through X’s Communities feature, where users create groups based on specific topics. NBC News observed groups with tens of thousands of members in which CSAM was solicited or was offered to be sold.

In a group with over 70,000 members devoted to “incest confessions,” multiple users posted repeatedly, linking to Telegram channels and explicitly referencing CSAM. “I’m selling 6cp folder for only 90$,” one user wrote, linking to a Telegram account. “CP” is a common online abbreviation for “child pornography.”

CSAM has been a perpetual problem on the internet and social media, with many companies employing specialized teams and building automated systems to identify and remove abuse content and those spreading it.

But Musk also instituted drastic cuts to the company’s trust and safety teams, and disbanded the company’s Trust and Safety Council. In 2023, the company said that it was detecting more CSAM than in previous years and that it had increased staffing devoted to the issue despite larger trust and safety layoffs.

A video grab from a video posted on the X account of Elon Musk in 2022. (Elon Musk via AFP – Getty Images, file)

Richardson, C3P’s director of information technology, said that while X will sometimes remove accounts that are flagged to it for violating rules around CSAM, “a new account pops up in two seconds, so there’s not a lot of in-depth remediation to the problem. That’s just sort of the bare minimum that we’re looking at here.”

He said an increasing reliance on artificial intelligence systems for moderation, if X is using them, could be partly to blame for such oversights. According to Richardson, AI systems are good at sorting through large datasets and flagging potential issues, but current systems will inevitably over- or under-moderate without human judgment at the end.

“There should be an actual incident response when someone is selling child sexual abuse material on your service, right? We’ve become completely desensitized to that. We’re dealing with the sale of children being raped,” Richardson said. “You can’t automate your way out of this problem.”


