Google Isn’t Removing Melania Trump “Deepfake” Porn & The Reason Why Is Confusing
In what is rapidly becoming a pressing ethical issue, a Vice report noted that Google hasn't deleted "deepfake" porn videos of Melania Trump. Vice first published the report on Monday, noting that a Google search for the first lady's name plus "deepfake" returns numerous fake porn videos of Trump among the results. At press time, Google had not responded to Bustle's request for comment.
In one of the clips, the first lady's face is superimposed onto an adult film performer's as the actress masturbates in front of the camera. The fake clip was first uploaded two months ago and appeared on Voat, a far-right community website and social network. But it isn't just the first lady whose face has been appropriated by fake porn creators: The face of Donald Trump's daughter and adviser Ivanka Trump has also been used in fabricated porn videos, according to Vice.
In January, Vice reported on an AI-based app that gave users the tools to create fake porn of pretty much anyone they wanted. It was open-source, meaning its code was publicly available for anyone to use and modify. Vice reported that a user named "deepfakes" would swap the faces of celebrities onto the bodies of adult film actresses in porn clips and then upload the videos to the now-banned "deepfakes" subreddit.
While Reddit promptly banned the subreddit, internet giant Google has yet to do anything about these user-generated videos, according to Vice. The publication noted that the company's approach toward requests for removing such content from Google was "extremely cautious."
In its web page titled "Legal Removal Requests," Google says that it will "carefully review the material" and then "consider blocking, removing, or restricting access to it." When it comes to "abusive" content, Google says, "Abusive content on Google's services may also violate Google's product policies, so before sending us a legal request, consider flagging the post, image, or video for one of our content teams to review."
Vice spoke with Hany Farid, chairman of Dartmouth College's computer science department, who said that the problem of "deepfakes" goes beyond porn. "Any time now a politician is caught saying something inappropriate, illegal, or offensive, they have plausible deniability, they are now going to say this content is fake," Farid told Vice. Farid, who is an image forensics expert, has written about detecting fake videos by observing the blood flow in a person's face as well as how natural light appears in such videos.
To demonstrate the rapidly advancing nature of AI-generated fake videos, comedian and film director Jordan Peele teamed up with Monkeypaw Productions and BuzzFeed to show a believably real former President Barack Obama calling Trump a "dipsh*t" and telling people to "stay woke." It's an entirely fake video, but Peele's remarkable imitation of Obama's voice — including an "uh" in the middle — makes the short clip highly convincing.
University of Texas law professor and former White House adviser Bobby Chesney told Vice that the creators of such "deepfakes" could legally argue that their videos constitute "art." That's where "it gets tricky," Chesney said. "If I'm a bad actor, when I’m called to account for a damaging 'deepfake,' I would certainly claim it is satire."
But victims of such videos may be able to hold "deepfake" creators accountable. In January, Bennet Kelley, founder of the Internet Law Center, told Bustle that although he didn't "think [present] legislation contemplates" the dangers of these videos, "you could argue that it's the same harm [as revenge porn]."
Observers have already noted that although these fake videos are being scrubbed from the pages of Twitter, PornHub, and Reddit, they remain visible in Google searches.