- Elon Musk reposted a deepfake video of Kamala Harris on X on Friday night.
- The video is a parody of one of Harris' election ads and appears to have been digitally altered.
- Musk's repost lacks context and may violate X's rules on synthetic and manipulated media.
On Friday night, Elon Musk reposted a deepfake video of Vice President Kamala Harris on X, a move that, The New York Times reported, may violate his own platform's policy on synthetic and manipulated media.
The video was originally posted by user @MrReaganUSA, who noted that it was a "parody" of Harris' first campaign ad since becoming the Democratic nominee for the 2024 presidential election.
The clip appears to have been digitally altered to include new narration that sounds like Harris.
"I was chosen because I am the ultimate diversity hire. I am a woman and a person of color, so anyone who criticizes what I have to say is sexist and racist," the altered voiceover says in the video.
The deceptive narration also claims President Joe Biden has dementia and that Harris and Biden are puppets of the "deep state."
Musk reposted the video, which has been viewed more than 117 million times, without mentioning that it had been edited, simply writing, “This is awesome 😂.”
And that could violate X's policy on synthetic and manipulated media, which states, “You may not share synthetic, manipulated, or out-of-context media that may deceive or confuse people and cause harm ('misleading media').”
X said that for the company to take action and remove or label posts that violate its policies, they must contain “media that is significantly deceptively altered,” “media that is shared in a deceptive manner or in a false context,” or media that could cause “widespread confusion about a public issue.”
The company said it would consider factors such as “whether there has been any added, edited or removed visual or auditory information (such as new video frames, dubbed audio, altered subtitles, etc.) that fundamentally changes the understanding, meaning or context of the media.”
Deepfake boom
Deepfakes use artificial intelligence to replace the likeness of a person in video or audio footage with that of another person.
Research has found that audio deepfakes are relatively easy to create but difficult to detect.
A number of politicians have fallen victim to the technology in the past, highlighting its potential to wreak havoc during election season.
A video that went viral on social media last year showed Hillary Clinton unexpectedly endorsing Florida Governor Ron DeSantis, but Reuters reported that the video was AI-generated.
Biden was also a victim of deepfakes after he announced he was dropping out of the 2024 presidential race.
A video posted to social media appeared to show the president attacking and hurling abuse at his critics, but AFP reported that the footage was also a deepfake.