Mastodon, a decentralized alternative to Twitter, has a serious problem with child sexual abuse material (CSAM), according to researchers from Stanford University. In just two days, the researchers found over 100 instances of known CSAM across more than 325,000 posts on Mastodon. They also found hundreds of posts containing CSAM-related hashtags, as well as links pointing to CSAM trading and the grooming of minors. One Mastodon server was even taken offline for a time after CSAM was posted to it. The researchers suggest that decentralized networks like Mastodon need to implement more robust moderation tools and reporting mechanisms to address the prevalence of CSAM.
This is kind of problematic… By creating a community-driven hash list that is freely shared, you've also effectively created an index of CSAM content that could easily be exploited by people actively looking to find or share that material.
Surely a list of hashes wouldn’t be that useful?
That is the standard approach for detecting CSAM media.
It is often freely implemented with tools: https://blog.cloudflare.com/the-csam-scanning-tool/ (and this also gets into how the hashes work).
The hashes themselves tend to be restricted to law enforcement agencies and trusted vendors/partners/services.
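For a rough idea of how that matching works, here's a minimal sketch in Python. It uses exact SHA-256 matching for simplicity; real scanners like PhotoDNA and Cloudflare's tool use perceptual (fuzzy) hashes so that resizing or re-encoding doesn't evade detection, and the blocklist entry below is a made-up placeholder, not a real list entry.

```python
import hashlib

# Hypothetical blocklist of known-bad file hashes. Real lists are
# restricted to law enforcement and trusted partners, and real systems
# use perceptual hashes rather than exact SHA-256.
BLOCKLIST = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def sha256_of_file(path: str) -> str:
    """Compute the SHA-256 digest of a file, streamed in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

def is_known_bad(path: str) -> bool:
    """Flag an upload if its hash appears on the blocklist."""
    return sha256_of_file(path) in BLOCKLIST
```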
Only if they're cryptographic hashes (the same class of one-way hash functions that underpin BTC, LTC, and other cryptocurrencies), since those are irreversible*
*I won't explain further; you've got the internet in your pocket.
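To make "irreversible" concrete, here's a quick example, assuming SHA-256 as the cryptographic hash:

```python
import hashlib

digest = hashlib.sha256(b"some arbitrary input").hexdigest()
print(digest)  # 64 hex chars; there is no general way to recover the input

# But identical inputs always produce identical digests, which is why a
# hash list still works as a lookup key: hash a candidate file and test
# for membership in the list.
print(hashlib.sha256(b"some arbitrary input").hexdigest() == digest)  # True
```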
Super useful. It's very similar to how magnet links for torrenting work. I know of a few less popular file-sharing services that can search for and fetch files based on the hash alone.
Plenty of other places online already use hashes as identifiers, too. If you search for the hash of a file you've downloaded (just the hash, nothing else), there's a very good chance you'll get multiple results.
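As a sketch of that "hash as identifier" idea, here's roughly how a BitTorrent magnet link is built. The 40-character hex info-hash (the SHA-1 of the torrent's info dictionary) is the real identifier; everything else is cosmetic. The hash below is a placeholder, not a real torrent.

```python
from urllib.parse import quote

def magnet_from_infohash(infohash_hex: str, display_name: str) -> str:
    # The info-hash alone is enough for peers to locate the content;
    # the display name is only a human-readable label.
    return f"magnet:?xt=urn:btih:{infohash_hex}&dn={quote(display_name)}"

# Placeholder info-hash for illustration only.
print(magnet_from_infohash("0123456789abcdef0123456789abcdef01234567", "example file"))
```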
Doesn’t anyone looking for that material already know what to look for?