When an edited video of House Speaker Nancy Pelosi, D-Calif., began spreading across the Web this week, researchers quickly identified it as a distortion, with sound and playback speed that had been manipulated to make her speech appear stilted and slurred.

But in the hours after the social-media giants were alerted, Facebook, Twitter and YouTube offered widely conflicting responses that potentially allowed the viral misinformation to continue its spread.

YouTube offered a definitive response Thursday afternoon, saying the company had removed the videos because they violated “clear policies that outline what content is not acceptable to post.”

Twitter declined to comment. But sharing the video would likely not conflict with the company’s policies, which permit “inaccurate statements about an elected official” as long as they don’t involve election manipulation or voter suppression. Several tweets sharing the video, often alongside insults that Pelosi was “drunk as (a) skunk,” remained online Friday.

But Facebook, where the video appeared to gain much of its audience, declined Friday to remove it, even after Facebook’s independent fact-checking partners, Lead Stories and PolitiFact, deemed the video “false.”

“We don’t have a policy that stipulates that the information you post on Facebook must be true,” Facebook said in a statement to The Washington Post.
[Photo: U.S. President Donald Trump shakes hands with Speaker of the U.S. House of Representatives Nancy Pelosi during the State of the Union address at the U.S. Capitol in Washington, D.C., on February 5, 2019. SAUL LOEB/AFP/Getty Images]
The company said it instead would “heavily reduce” the video’s appearances in people’s News Feeds, append a small informational box alongside the video linking to the two fact-check sites, and open a pop-up box linking to “additional reporting” whenever someone clicks to share the video.

That didn’t satisfy lawmakers such as Rep. David N. Cicilline, a Rhode Island Democrat, who took to Twitter to demand that Facebook “fix this now!”

U.S. Sen. Brian Schatz, D-Hawaii, tweeted: “Facebook is very responsive to my office when I want to talk about federal legislation and suddenly get marbles in their mouths when we ask them about dealing with a fake video. It’s not that they cannot solve this; it’s that they refuse to do what is necessary.”

While Facebook’s actions might provide context and lower the rate at which people happen upon the video while browsing the social network, they did virtually nothing to prevent the false video’s spread by people who had already seen it: Any user could still like, comment on, view and share the video as often as they liked.
In the 24 hours after The Washington Post alerted Facebook to the video, its viewership on a single Facebook page had nearly doubled, to more than 2.5 million views. The video had also been reposted to other Facebook pages, where its audience was growing even further.

“Just for the record we never claimed that Speaker Pelosi was drunk,” Politics WatchDog wrote in a May 23 Facebook post. “We can’t control what the people in the comments think. It’s a free country. For your information we are not a conservative news outlet. Washington Post is fake news!”

The conflicting responses reveal a key vulnerability in how the Internet giants safeguard against viral lies and blatant falsehoods. The companies run some of the country’s most prominent and powerful sources of information, including for understanding political campaigns in the months heading into the 2020 presidential election. But they have shown little ability – and, in Facebook’s case, little interest – in limiting the spread of falsehoods.

Facebook has resisted removing outright false information by citing free-speech concerns, a stand the company reiterated Friday.

“There’s a tension here: we work hard to find the right balance between encouraging free expression and promoting a safe and authentic community, and we believe that reducing the distribution of inauthentic content strikes that balance,” Facebook said in a statement.

But Jason Kint, the chief executive of Digital Content Next, a trade group representing online publishers, said Facebook should take a more active role in policing and slowing the spread of misinformation. The site, he said, is reluctant to give too much power to fact-checkers or content moderators, and many pieces of content can lapse into gray areas, where people’s perceptions of the material depend on their personal politics. But with clearer distortions like the Pelosi video, he said, the company should respond more quickly and decisively to stifle the disinformation before it gains a life of its own.

“When they put it into people’s timelines and give it velocity and reach that it doesn’t deserve, they’re helping to spread it,” he said.

President Donald Trump on Thursday night tweeted a separate video taken from Fox Business Network: a selectively edited 30-second clip focused on Pelosi’s pauses and verbal stumbles from a 20-minute official briefing earlier that day.

The videos fed into what Pelosi’s defenders have called sexist and conspiratorial portrayals of the health of America’s highest-ranking elected woman. They also resemble political videos that raised similar questions about Hillary Clinton’s fitness during the 2016 campaign.

Pelosi tweeted Thursday night that Trump was “distracting from House Democrats’ great accomplishments #ForThePeople, from his cover-ups, and unpopularity.”