
The Alarming Rise Of AI Apps Creating Explicit Images Of Real People
As AI evolves, it has brought with it an alarming rise in “nudify” tools – hundreds of apps and sites that can easily make fake explicit imagery from photos of anyone, including children. One man in Minneapolis used one of these sites to create graphic content of more than 80 women, including many of his close friends. Three of these women sat down with CNBC to tell their stories of how these fake photos and videos have caused irreversible harm. Often marketed as simple face-swappers, nudify tools are everywhere, from the Apple and Google app stores to ads on Facebook and Instagram. It is still legal to create nonconsensual deepfakes, but now this group of friends and a Minnesota state senator are trying to change that.
Warning: This story contains sensitive content.
Chapters:
0:00 Introduction
2:16 Finding fake nudes
6:40 Easy, cheap and profitable
8:50 Role of big tech
12:36 Fighting back
Credit: CNBC