I’ve heard from someone who works in their security department that they go through photos to check for pedophilia-related content.

Trying to confirm that online but I don’t see any articles about this.

Has anyone heard something similar? Perhaps this only applies to public groups.

  • catloaf@lemm.ee · 21 hours ago

    Typically it’s an automated scan against known CSAM. If someone in the chat reports an image, it might go to a human for review. It’s unlikely that a human will look through your images or messages unprompted.
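The automated scan described above boils down to hash matching: compute a fingerprint of each image and check it against a database of fingerprints of known material. This is a minimal sketch under assumed names (`KNOWN_HASHES`, `matches_known_content` are hypothetical); real systems such as Microsoft’s PhotoDNA use perceptual hashes that survive resizing and re-encoding, whereas the plain cryptographic hash here only catches byte-identical copies.

```python
import hashlib

# Hypothetical database of known-content fingerprints. The entry below is
# just sha256(b"test"), used as a stand-in value for the sketch.
KNOWN_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def matches_known_content(image_bytes: bytes) -> bool:
    """Return True if this exact image appears in the known-hash database."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest in KNOWN_HASHES

# A match triggers an automated flag; anything else passes through untouched.
print(matches_known_content(b"test"))   # exact copy of a known image
print(matches_known_content(b"other"))  # unknown image, no match
```

The design point is that the service never needs to "look at" the photo: it only compares fingerprints, and human review enters the picture when a match or a user report occurs.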

  • modernangel@sh.itjust.works · 22 hours ago

    The details of internal surveillance for illegal content would necessarily be nontransparent, because you really don’t want users exploiting loopholes to trade that shit with impunity.

  • Biyoo@lemmy.blahaj.zone · 23 hours ago

    The feds maintain a database of known CSAM, and basically every major service, WhatsApp probably included, checks images against it for matches. I’m guessing that’s what he meant. I don’t think WhatsApp actually scans images beyond that, though.