Mastodon, a decentralized alternative to Twitter, has a serious problem with child sexual abuse material (CSAM), according to researchers from Stanford University. In just two days, the researchers found more than 100 instances of known CSAM across over 325,000 posts on Mastodon, along with hundreds of posts containing CSAM-related hashtags and links pointing to CSAM trading and the grooming of minors. One Mastodon server was even taken down for a period of time because of CSAM posted to it. The researchers suggest that decentralized networks like Mastodon need to implement more robust moderation tools and reporting mechanisms to address the prevalence of CSAM.

  • abhibeckert@beehaw.org · 1 year ago

    I agree, but who’s going to pay for it?

    How about police/the taxpayer?

    If university researchers can find the stuff, then police can find it too. There should be an established way to flag the user (or even the entire instance) so that the content can be removed from the fediverse while simultaneously requesting whatever data is available to help catch the criminals — see the sketch after this comment.

    And of course, if regular users come across anything illegal, they will report it too, and it should be removed quickly (immediately in many cases, I'd hope, especially if the post came from a brand-new/untrusted account).
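
For context, the fediverse already has a building block for the flagging half of this: reports federate between instances as ActivityStreams `Flag` activities, which is how Mastodon forwards a report to the reported account's home server. Below is a minimal sketch of constructing and delivering such a report; the URLs are hypothetical examples, and the HTTP-signature step that real servers require on inbox deliveries is omitted.

```python
import json
import urllib.request


def build_flag_activity(reporter_actor, reported_actor, post_uris, comment):
    """Build an ActivityStreams `Flag` activity — the vocabulary Mastodon
    uses when it federates a report to another instance."""
    return {
        "@context": "https://www.w3.org/ns/activitystreams",
        "type": "Flag",
        "actor": reporter_actor,                 # typically the instance actor
        "object": [reported_actor, *post_uris],  # the account plus offending posts
        "content": comment,                      # free-text reason for moderators
    }


def deliver_report(inbox_url, activity):
    """POST the activity to the remote instance's inbox.
    Real deliveries must carry an HTTP signature; that is omitted here."""
    req = urllib.request.Request(
        inbox_url,
        data=json.dumps(activity).encode("utf-8"),
        headers={"Content-Type": "application/activity+json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status


# Hypothetical example values, for illustration only.
report = build_flag_activity(
    reporter_actor="https://example.social/actor",
    reported_actor="https://bad.example/users/offender",
    post_uris=["https://bad.example/users/offender/statuses/1"],
    comment="Illegal content — please remove and preserve evidence",
)
# deliver_report("https://bad.example/inbox", report)
```

What ActivityPub does not define is the second half of the comment's proposal: a standardized channel for escalating such reports to law enforcement with preserved evidence. That part would still have to be built on top.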