Fighting Deepfakes: What's Being Done? From Biden Robocalls to Taylor Swift AI Images
The world is awash in deepfakes — video, audio or pictures in which people appear to do or say things they didn't, or to be somewhere they weren't. Most deepfakes are explicit videos and images concocted by mapping the face of a celebrity onto the body of someone else. Others are used to scam consumers, or to damage the reputations of politicians and other people in the public eye. Advances in artificial intelligence mean it now takes just a few taps on a keyboard to conjure them up. Alarmed governments are looking for ways to fight back.
On Feb. 8, the US Federal Communications Commission made it illegal for companies to use AI-generated voices in robocalls. The ban came two days after the FCC issued a cease-and-desist order against the company responsible for an audio deepfake of President Joe Biden. Before New Hampshire's presidential primary, residents of the state received a robocall that sounded like Biden urging them to stay home and "save your vote for the November election." The voice even uttered one of Biden's signature phrases: "What a bunch of malarkey."

There is currently no US federal law banning deepfakes. Some states have implemented laws targeting deepfake pornography, but their application is inconsistent across the country, making it difficult for victims to hold creators to account. The European Union's proposed AI Act would require platforms to label deepfakes as such.