Recently, I was shown dozens of small pictures of Donald Trump, some real, others digitally created. I found it impossible to tell which were which. Asked to pick three that were possibly fake, I got only one right.
The exercise was an introduction to the looming security threat of “deepfakes”: the use of artificial intelligence to imitate speech and images and create alternative realities, making someone appear to say or do things they never said or did.
In their simplest form, deepfakes are created by feeding a computer images and audio of a person and training it to imitate that person’s face and voice (and possibly much more). There is already an app for that, FakeApp, along with video tutorials on how to use it, and an underground digital community that superimposes celebrity faces on to the bodies of actors in porn videos.
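To make the idea concrete, here is a minimal, illustrative sketch of the kind of face-swap training that early deepfake tools popularised: one shared encoder learns a common representation of faces, and a separate decoder is trained for each person, so that a face from one person can be redrawn as the other. This is not FakeApp’s actual code; the sizes, names and training details are assumptions, written here in Python with PyTorch.

```python
# Illustrative sketch of the shared-encoder / per-identity-decoder idea
# behind early face-swap deepfakes. Sizes and hyperparameters are assumed.
import torch
import torch.nn as nn

IMG = 64  # assumed resolution of aligned face crops

def make_encoder():
    # Compress a face image into a small latent code shared by both people.
    return nn.Sequential(
        nn.Flatten(),
        nn.Linear(3 * IMG * IMG, 512), nn.ReLU(),
        nn.Linear(512, 128), nn.ReLU(),
    )

def make_decoder():
    # Reconstruct a face image from the shared latent code.
    return nn.Sequential(
        nn.Linear(128, 512), nn.ReLU(),
        nn.Linear(512, 3 * IMG * IMG), nn.Sigmoid(),
        nn.Unflatten(1, (3, IMG, IMG)),
    )

encoder = make_encoder()
decoder_a = make_decoder()   # learns to draw person A's face
decoder_b = make_decoder()   # learns to draw person B's face
loss_fn = nn.L1Loss()
optimizer = torch.optim.Adam(
    list(encoder.parameters())
    + list(decoder_a.parameters())
    + list(decoder_b.parameters()),
    lr=1e-4,
)

def train_step(faces_a, faces_b):
    # Each decoder learns to reconstruct its own person from the shared latent space.
    recon_a = decoder_a(encoder(faces_a))
    recon_b = decoder_b(encoder(faces_b))
    loss = loss_fn(recon_a, faces_a) + loss_fn(recon_b, faces_b)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

def swap_face(face_a):
    # The face-swap trick: encode person A, decode with person B's decoder,
    # producing B's face with A's pose and expression.
    with torch.no_grad():
        return decoder_b(encoder(face_a))

# Illustrative usage with random tensors standing in for real face crops.
faces_a = torch.rand(8, 3, IMG, IMG)
faces_b = torch.rand(8, 3, IMG, IMG)
print(train_step(faces_a, faces_b))
print(swap_face(faces_a).shape)  # torch.Size([8, 3, 64, 64])
```

In practice, tools of this kind run many thousands of such training steps over thousands of face crops before the swapped output looks convincing; the sketch above only shows the shape of the technique.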