

96% of deepfakes are pornographic

The target for deepfakes isn’t politicians.

Jacob Bergdahl


Deepfakes are fake images, videos, or audio clips generated with AI technologies. With each passing year, deepfakes become increasingly realistic while also becoming ever easier to create. Using deepfakes, one can put any person’s face onto another person’s body and make it look like they are saying or doing things they never did.

Many fear that deepfakes will be used to advance political agendas. After all, anyone could create a fake video of a president saying anything, potentially sparking conflicts or even war. This has yet to happen, however. While a small number of political deepfakes exist, they have mostly been made for comedic, sensational, or informational purposes.

Deeptrace analyzed the state of deepfakes in late 2019. They discovered 14,678 deepfake videos online, almost double the number they had found in late 2018. A monstrous 96% of these videos were pornographic.

Though deepfake pornography is a rather recent invention, Deeptrace noted that deepfake videos across the top four dedicated deepfake pornography websites had amassed over 130 million views. Every single one of these pornographic videos depicted a woman. The vast majority of these women were celebrities, mostly actresses and musicians.

While many may have assumed that deepfakes would be a weapon aimed at politicians, the reality is that deepfakes are a terrifying weapon aimed at harassing and harming women.
