Mastodon, a decentralized social network and alternative to Twitter, has a serious problem with child sexual abuse material (CSAM), according to researchers from Stanford University. In just two days, the researchers found over 100 instances of known CSAM across more than 325,000 posts on Mastodon, along with hundreds of posts containing CSAM-related hashtags and links pointing to CSAM trading and the grooming of minors. One Mastodon server was even taken down for a period of time because CSAM had been posted to it. The researchers suggest that decentralized networks like Mastodon need more robust moderation tools and reporting mechanisms to address the prevalence of CSAM.

  • sub_o@beehaw.org · 1 year ago

    I think some of the problematic instances have been defederated; IIRC there's a large Japanese instance that was defederated a long time ago due to child abuse content. But still, since I've been seeing increases in hate speech and dog-whistle misogyny and homophobia on some instances, I wouldn't be surprised if CSAM has been traded right under our noses.

    The main issue is that, with so many users nowadays and small moderation teams, especially on the larger instances, it's hard to moderate effectively and tackle CSAM. I really wish larger instances would limit user registrations or start splitting off into smaller, more manageable ones.

    Also, since the material is being traded via certain hashtags, blocking those hashtags might not be a bad idea; a rough sketch of how that could work is below.
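
    To make the hashtag-blocking idea concrete, here is a minimal Python sketch of a blocklist filter that an instance's moderation pipeline could run on incoming posts. Everything in it is hypothetical: the BLOCKED_HASHTAGS set, the Post structure, and the should_reject hook are illustrative stand-ins, not Mastodon's actual moderation API.

    ```python
    from dataclasses import dataclass, field

    # Hypothetical blocklist; a real deployment would load this from
    # admin-maintained configuration and keep it updated as traders
    # rotate to new hashtags.
    BLOCKED_HASHTAGS = {"exampleblockedtag1", "exampleblockedtag2"}

    @dataclass
    class Post:
        """Stand-in for an incoming federated status."""
        author: str
        content: str
        hashtags: list[str] = field(default_factory=list)

    def should_reject(post: Post) -> bool:
        """Return True if the post uses any blocked hashtag.

        Hashtags are compared case-insensitively, since #Tag and #tag
        resolve to the same tag on most platforms.
        """
        return any(tag.lower() in BLOCKED_HASHTAGS for tag in post.hashtags)

    # Example: a post carrying a blocked hashtag gets flagged.
    incoming = Post(author="spam@example.social",
                    content="...",
                    hashtags=["ExampleBlockedTag1", "art"])
    print(should_reject(incoming))  # True
    ```

    In practice a filter like this would more likely queue matches for human review than silently drop posts, since hashtag matching alone is easy to evade and prone to false positives.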