I’ve heard from someone who works in their security department that they go through photos to check for pedophilia-related content.
Trying to confirm that online but I don’t see any articles about this.
Has anyone heard anything similar? Perhaps this only applies to public groups?
Typically it’s an automated scan against a database of known CSAM. If someone in the chat reports an image, it might go to a human for review, but it’s unlikely that a human will look through your images or messages unprompted.
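For anyone curious what “automated scan against known CSAM” looks like in practice, here’s a minimal sketch in Python. It uses the open-source imagehash library purely as a stand-in for the proprietary systems (like Microsoft’s PhotoDNA) that services actually run; the hash value and distance threshold below are made up for illustration.

```python
# Sketch of hash-based image scanning: compare an image's perceptual
# hash against a database of hashes of known flagged images. The
# service only needs the hashes, never the images themselves.
from PIL import Image
import imagehash

# Hypothetical database of perceptual hashes of known flagged images
# (real services receive these from clearinghouses like NCMEC).
KNOWN_HASHES = {
    imagehash.hex_to_hash("d1d1d1d1e0e0f0f0"),  # placeholder value
}

MAX_DISTANCE = 5  # Hamming-distance threshold for calling it a match

def is_flagged(path: str) -> bool:
    """Return True if the image is near any hash in the database."""
    h = imagehash.phash(Image.open(path))
    # Perceptual hashes match on similarity, not exact equality, so
    # resized or slightly edited copies of a known image still hit.
    return any(h - known <= MAX_DISTANCE for known in KNOWN_HASHES)

if __name__ == "__main__":
    print(is_flagged("upload.jpg"))
```

The point of the perceptual hash (as opposed to a plain cryptographic hash) is that trivial edits to a known image don’t defeat the match, which is why these scans can run fully automated with no human looking at your photos.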
It’s owned by Facebook. I don’t know why anybody expected otherwise.
The details of internal surveillance for illegal content would necessarily be kept opaque, because you really don’t want users exploiting loopholes to trade that stuff with impunity.
This surely won’t be abused in any way \s
I’m super ok with them doing that
The feds have a database of known CP hashes, and basically every major service, WhatsApp probably included, uses it to check for a match. I’m guessing that’s what he meant. I don’t think WhatsApp actually scans images outside of that tho.