I’m sort of surprised companies haven’t started to implement some hidden visual verification in their broadcasts, so they can prove deepfakes of them aren’t real.
They won’t do that because lots of customers won’t pay for an AI service if they can’t use the material to trick and defraud. Propaganda and misinformation are among the biggest selling points for AI. They also don’t want their company’s watermark on the demented, often illegal sexual material that people make with these things.
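For what it’s worth, here is a minimal sketch of what that kind of verification could look like on the broadcaster’s side: sign a hash of each frame with a private key and publish the public key so anyone can check a clip. This assumes the third-party `cryptography` package; real proposals (e.g. C2PA content credentials) embed robust watermarks or signed metadata that survive re-encoding, which a plain hash does not.

```python
# Sketch: a broadcaster signs each frame's hash so anyone holding the
# published public key can check that a clip really came from them.
# Assumes the third-party `cryptography` package (pip install cryptography).
import hashlib

from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# The broadcaster keeps the private key; the public key is published.
private_key = Ed25519PrivateKey.generate()
public_key = private_key.public_key()


def sign_frame(frame_bytes: bytes) -> bytes:
    """Hash the raw frame and sign the digest (distributed alongside the broadcast)."""
    digest = hashlib.sha256(frame_bytes).digest()
    return private_key.sign(digest)


def verify_frame(frame_bytes: bytes, signature: bytes) -> bool:
    """Anyone can check a clip against the published public key."""
    digest = hashlib.sha256(frame_bytes).digest()
    try:
        public_key.verify(signature, digest)
        return True
    except InvalidSignature:
        return False


# A genuine frame verifies; a doctored one does not.
frame = b"raw pixel data for one broadcast frame"
sig = sign_frame(frame)
print(verify_frame(frame, sig))               # True
print(verify_frame(b"deepfaked frame", sig))  # False
```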