Videos deceptively edited to make Speaker Nancy Pelosi (D-Calif.) appear to drunkenly slur her words are spreading across social media despite attempts by platforms to halt their dissemination, according to The Washington Post.
The videos alter the audio of a speech Pelosi gave at a Center for American Progress event Wednesday, in which she accused President Trump of a “cover-up,” to make it sound as though she is slurring her words.
The video has been viewed more than 1.3 million times and shared more than 32,000 times on the Facebook page Politics WatchDog, according to the Post.
The edits appear to involve slowing the audio to about 75 percent of its original speed while raising the pitch to mask the deeper, distorted voice that slowing produces, according to an analysis by the Post.
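The reason a pitch correction is needed becomes clear in a simple sketch. The snippet below (an illustration only, not the actual tool used on the clip; function names such as `slow_down` are hypothetical) naively time-stretches a signal by resampling it, which is what slowing audio to 75 percent speed does: every frequency in the signal drops by the same factor, so a forger must also shift the pitch back up to disguise the edit.

```python
import numpy as np

def slow_down(audio: np.ndarray, rate: float = 0.75) -> np.ndarray:
    """Naive time-stretch by resampling: reading samples at `rate`
    speed lengthens the clip by 1/rate, but it also multiplies every
    frequency by `rate` -- the familiar 'slowed, deeper voice' effect."""
    n_out = int(round(len(audio) / rate))
    src_idx = np.arange(n_out) * rate  # fractional read positions
    return np.interp(src_idx, np.arange(len(audio)), audio)

def dominant_hz(x: np.ndarray, sr: int) -> float:
    """Return the strongest frequency in the signal via an FFT peak."""
    spectrum = np.abs(np.fft.rfft(x))
    return float(np.fft.rfftfreq(len(x), 1 / sr)[np.argmax(spectrum)])

# Demo: a 440 Hz tone slowed to 75 percent speed comes out near 330 Hz
# (440 * 0.75), so a convincing fake needs a separate pitch-shift step
# (typically a phase vocoder) to restore the original pitch.
sr = 8000
t = np.arange(sr) / sr          # 1 second of audio at 8 kHz
tone = np.sin(2 * np.pi * 440 * t)
slowed = slow_down(tone, 0.75)
```

Pitch-preserving correction is the harder half of the trick; off-the-shelf audio editors do it with a phase vocoder, which is why such edits are accessible to non-experts.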
The video has been shared by multiple YouTube and Twitter accounts, as well as in the comments sections of local news outlets.
A spokesperson for YouTube told The Hill the video had been removed from the platform for company policy violations.
“YouTube has clear policies that outline what content is not acceptable to post and we remove videos violating these policies when flagged to us. These videos violated our policies and have been removed,” the YouTube spokesperson said.
The spokesperson also denied the videos had been displayed prominently.
“They also did not surface prominently. In fact, search results and watch next panels about Nancy Pelosi include videos from authoritative sources, usually at the top,” the representative said.
The video has surfaced as cybersecurity experts are increasingly concerned about the potential use of video editing to spread misinformation, particularly computer-altered “deep fake” videos, which Fabrice Pothier, senior adviser with the Transatlantic Commission on Election Integrity, called “the next weapon in the disinformation warfare” in January.
The video of Pelosi, by contrast, “shows that there is a larger threat of misinformation campaigns — too many of us are willing to believe the worst in people that we disagree with,” Hany Farid, a computer science professor and digital-forensics expert at the University of California, Berkeley, told the Post. “It is striking that such a simple manipulation can be so effective and believable to some.”
— Updated at 6:19 p.m.