AI-generated imagery: you might never be able to trust the internet again
AI-generated art by Midjourney and Stable Diffusion is just the tip of the iceberg

Ever since the rise of deepfakes, citizens and governments have grown increasingly concerned about fake photos and videos. Everyone seemed to be in danger: voters might become targets of political disinformation campaigns built on deepfakes; people in court might be unable to distinguish a fabricated piece of video evidence from a genuine one; people with jealous partners might find their faces appearing in a porn clip, and struggle to explain that they never made it.
Some technologists dismissed the huge danger that deepfakes posed. Pictures have always been tampered with, they said, so what would make deepfakes more dangerous than Photoshop or the film industry in Hollywood?
The answer is scale. Before deepfakes existed, editors had to spend hours modifying a single photo or video convincingly. Now, a computer can generate a deepfake in a matter of seconds.
This opened the floodgates for disinformation campaigns. Most recently, a deepfake in which Ukrainian president Zelenskyy told his soldiers to lay down their arms drew widespread attention. It was quickly debunked, but you can probably picture what might have happened if the Ukrainian government hadn't reacted to it so quickly.
Luckily for society, however, deepfakes are fairly easy to spot. The movement in these videos tends to be slightly unnatural, and sometimes body parts don't seem to fit together properly. Furthermore, deepfakes tend to break down when people are shown from a side angle or wave their hands in front of their faces.
All this might change with upcoming image generation technology.
Why deep learning image synthesis is so powerful
Deepfakes work by combining two or more pieces of media into a new one. For example, if we wanted to generate a video of Barack Obama saying things he never said, we would need a voice actor to record the soundtrack and a short video of Obama saying something else.
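To make those ingredients concrete, here is a minimal, purely illustrative Python sketch of that workflow. The LipSyncModel class, its methods, and the file names below are hypothetical placeholders, loosely modelled on lip-sync and face-reenactment tools rather than any real API; the only point is that the raw material is a short genuine clip of the target plus a new soundtrack.

```python
from pathlib import Path


class LipSyncModel:
    """Hypothetical stand-in for a pretrained lip-sync / face-reenactment model."""

    @classmethod
    def load_pretrained(cls, checkpoint: str) -> "LipSyncModel":
        # A real tool would load model weights from `checkpoint` here.
        raise NotImplementedError("placeholder for loading a real checkpoint")

    def generate(self, source_video: Path, new_audio: Path, output: Path) -> None:
        # A real tool would re-animate the face in `source_video` so its lip
        # movements match `new_audio`, writing the result to `output`.
        raise NotImplementedError("placeholder for the actual synthesis step")


def make_fake_statement(source_video: Path, new_audio: Path, output: Path) -> None:
    """Combine existing footage of the target with a new soundtrack."""
    model = LipSyncModel.load_pretrained("lipsync-checkpoint")  # hypothetical name
    model.generate(source_video, new_audio, output)


# Illustrative usage: a short genuine clip of the target plus a voice actor's
# recording is all the raw material such a pipeline needs.
# make_fake_statement(Path("obama_clip.mp4"), Path("voice_actor.wav"), Path("fake.mp4"))
```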