Not a good look for Mastodon - what can be done to automate the removal of CSAM?

  • Arotrios · 6 points · 1 year ago

    Sweet - thanks - that’s a brilliant tool. Bookmarked.

      • Arotrios · 4 points · 1 year ago

        Thanks for the comment - I wasn’t aware of a Cloudflare controversy in play, and went through your links and the associated Wikipedia page. It’s interesting to me, as someone who previously ran a public forum, to see them struggle on a larger scale with the same issues surrounding hate speech that I did.

        I agree with your thoughts on a centralized service having that much power, although Cloudflare does have a number of competitors, so I’m not quite seeing the risk here, save for the fact that Cloudflare appears to be the only one offering CSAM filtering (I’ll have to dig in further to confirm). The ActivityPub blocking of particular instances is concerning, but I don’t see a source on that - do you have more detail?

        However, I disagree with your statement on handling unsolicited content - from personal experience, I can confidently state that there are some things that get submitted that you just shouldn’t subject another human to, even if it’s only to determine whether or not it should be deleted. CSAM falls under this category in my book. Having a system in place that keeps you and your moderators from having to deal with it is invaluable to a small instance owner.
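        To illustrate the kind of pre-screening I mean, here’s a minimal sketch of hashing uploads against a list of known-bad hashes before they ever reach the moderation queue. The function names and hash list are made up for the example; real services like Cloudflare’s scanner or PhotoDNA match perceptual hashes against vendor-maintained databases rather than plain SHA-256.

            import hashlib
            from pathlib import Path

            # Hypothetical hash list - in practice this would be a perceptual-hash
            # database (PhotoDNA, PDQ) fed by a trusted clearinghouse, not SHA-256.
            KNOWN_BAD_HASHES: set[str] = set()

            def sha256_of(path: Path) -> str:
                # Stream the file so large uploads aren't read into memory at once.
                digest = hashlib.sha256()
                with path.open("rb") as f:
                    for chunk in iter(lambda: f.read(65536), b""):
                        digest.update(chunk)
                return digest.hexdigest()

            def screen_upload(path: Path) -> bool:
                # True means quarantine the upload before any moderator sees it.
                return sha256_of(path) in KNOWN_BAD_HASHES

        Anything screen_upload() flags can be quarantined and reported automatically, so nobody on the mod team ever has to open it.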

      • P03 Locke · 3 points · 1 year ago

        I trust Cloudflare a helluva lot more than I trust most of the companies discussed in this thread. Their transparency is second to none.