corb3t@lemmy.world to Technology@lemmy.ml · 2 years ago
Stanford researchers find Mastodon has a massive child abuse material problem (www.theverge.com)
233 comments · cross-posted to: [email protected], [email protected], [email protected], [email protected], [email protected], [email protected], [email protected]
balls_expert@lemmy.blahaj.zone · 2 years ago
There is a database of known CSAM files and their hashes; Mastodon could implement a filter against it at the posting interaction and when federating content. Shadow-banning those users would be nice too.
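The filter described above could, in its simplest form, check each upload's digest against a block list before accepting or federating it. A minimal sketch, with the caveat that the hash set here is hypothetical and that real deployments (e.g. PhotoDNA or NCMEC-sourced lists) use perceptual hashes served through vetted APIs rather than plain cryptographic hashes over raw bytes:

```python
import hashlib

# Hypothetical in-memory block list of SHA-256 digests. The demo entry is
# the well-known digest of the empty byte string, used only for illustration.
KNOWN_BAD_HASHES = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def is_known_match(file_bytes: bytes) -> bool:
    """Return True if the upload's digest appears in the block list."""
    digest = hashlib.sha256(file_bytes).hexdigest()
    return digest in KNOWN_BAD_HASHES

# At the posting or federation boundary, a matching upload would be rejected
# and the account flagged for moderator review.
```

An exact-hash check like this only catches byte-identical copies; that limitation is one reason production systems rely on perceptual hashing, which survives re-encoding and resizing.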
diffuselight@lemmy.world · 2 years ago
They are talking about AI-generated images. That’s the volume part.