Stanford researchers uncover Mastodon CSAM crisis
According to recent research from Stanford's Internet Observatory, Mastodon, the decentralized network seen as a viable alternative to Twitter, is rife with child sexual abuse material (CSAM). Researchers analyzed 325,000 posts on the network over just two days and found 112 instances of known CSAM, with the first match turning up after only five minutes of searching.
According to The Verge, the Internet Observatory scanned the 25 most popular Mastodon instances for CSAM as part of its investigation. Researchers used PhotoDNA, a hash-matching tool that identifies previously catalogued CSAM, along with Google's SafeSearch API to flag explicit images.
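PhotoDNA itself is proprietary and access-restricted, but the underlying idea is to compare a fingerprint of an uploaded image against a database of fingerprints of previously identified material. As a rough illustration of that hash-matching concept only, here is a minimal sketch using the open-source imagehash library; the file name, hash value, and distance threshold are all hypothetical, and this is not how PhotoDNA actually computes its hashes.

```python
# Illustrative only: PhotoDNA is proprietary, so this sketch uses the
# open-source imagehash library to show the general hash-matching idea.
import imagehash
from PIL import Image

# Perceptual hashes of previously identified images (hypothetical value).
KNOWN_HASHES = [imagehash.hex_to_hash("d1d1b1a1c1e1f101")]

def matches_known_image(path, max_distance=5):
    """Return True if the image's perceptual hash is within
    max_distance (Hamming distance) of any known hash."""
    candidate = imagehash.phash(Image.open(path))
    return any(candidate - known <= max_distance for known in KNOWN_HASHES)

print(matches_known_image("upload.jpg"))  # hypothetical file
```

Comparing by Hamming distance rather than exact equality is what lets this kind of matching survive resizing or minor edits to an image.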
The team's search yielded 554 pieces of content matching hashtags or keywords frequently used by online groups that promote child sexual abuse, all of which Google SafeSearch classified as explicit with the "highest confidence."
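For readers curious how that classification works in practice, here is a minimal sketch using the official google-cloud-vision Python client, which requires Google Cloud credentials. The file name is hypothetical; the report's "highest confidence" wording corresponds to the API's VERY_LIKELY likelihood level.

```python
# Minimal sketch of Google's SafeSearch detection via the official
# google-cloud-vision client (requires Google Cloud credentials).
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("post_image.jpg", "rb") as f:  # hypothetical file name
    image = vision.Image(content=f.read())

annotation = client.safe_search_detection(image=image).safe_search_annotation

# SafeSearch returns a likelihood per category; VERY_LIKELY is the
# "highest confidence" level the Stanford report refers to.
if annotation.adult == vision.Likelihood.VERY_LIKELY:
    print("Image classified as explicit with highest confidence.")
```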
Across the Fediverse, the top 20 CSAM-related hashtags appeared 713 times on posts containing media, and 1,217 text-only posts referenced "off-site CSAM trading or grooming of minors." The report describes the open posting of CSAM as "disturbingly prevalent."
The report cites the prolonged mastodon.xyz server outage as an example of an incident triggered by CSAM posted on Mastodon. In a post about the incident, the server's sole administrator said they had been notified of content containing CSAM, but noted that moderation is done in their spare time and can take up to a few days; this is not a massive operation like Meta with a global team of contractors, but a single person.
Although the administrator said they had taken action against the offending content, the mastodon.xyz domain's registrar suspended it anyway, leaving the server unreachable until the administrator could get in touch with someone to restore its listing. After the problem was resolved, the administrator reports, the registrar moved the domain to a "false positive" list to prevent further takedowns. The researchers note, however, that "what caused the action was not a false positive."