
Despite Elon Musk’s claims that child abuse content on Twitter is being aggressively removed, a report claims that child sexual abuse material (CSAM) still exists on the micro-blogging platform.
The New York Times found photographs of ten child abuse victims shared in 150 instances “across several accounts” on Twitter.
“Child sexual assault imagery spreads on Twitter even after the firm is notified: One video received 120,000 views,” according to the investigation.
Meanwhile, the Canadian Centre for Child Protection found 260 of the “most explicit films” in its database circulating on Twitter, with over 174,000 likes and 63,000 retweets in total.
According to the report, Twitter’s recommendation algorithm amplified some of the images.
Following notification from the Canadian Centre for Child Protection, the site apparently removed some of the troubling content.
“The amount of CSAM we’re able to locate with minimal effort is extremely considerable,” said Lloyd Richardson, technical director at the Canadian centre.
“It shouldn’t be the business of outsiders to locate this kind of content on their system,” Richardson was quoted as saying.
Musk cut 15% of Twitter’s trust and safety staff, who handle content moderation, in November of last year, saying the cuts would have no effect on moderation.
Twitter announced earlier this month that it is “proactively and severely limiting the reach” of CSAM and that it will attempt to “delete the content and suspend the bad actor(s) involved.”
Musk previously stated that eradicating child abuse content is his number one goal.