Facebook’s under fire for allowing the posting of altered videos of Nancy Pelosi. One of those videos makes her appear drunk. For reasons that may not be clear at first, this is an existential crisis for Facebook.
First, let’s clear up some confusion. There are at least two Pelosi videos under discussion right now. One, from Fox Business, is a set of outtakes that show Pelosi hesitating and stuttering. Donald Trump actually retweeted it.
A second, as described in the New York Times, is slowed down to make Pelosi appear drunk.
On CNN, Facebook’s VP for Product Policy and Counterterrorism, Monika Bickert, told Anderson Cooper that the platform was marking the video as false, based on fact-checkers’ findings, but wouldn’t take it down.
Pelosi herself has called for Facebook to take the doctored video down.
So, what should Facebook do? That’s a far tougher question to answer than you might think.
The protean scope of “fake” video
Any idiot can now manipulate video, in any number of ways. Take a look at this list of possible alterations, and imagine each technique applied to videos of your favorite political figure and one you truly despise (say, Donald Trump and Barack Obama). Then tell me: which of these categories should Facebook ban, and which should it leave up, marked as manipulated?
- A 3-minute clip from a longer speech.
- A single 10-second clip from a speech, taken out of context.
- A series of 4-second clips from the same speech, taken out of context.
- A verbal stumble, repeated to ridicule the speaker.
- Video clips interspersed with other video to make a humorous parody.
- Video slowed down 25% to make the speaker appear drunk.
- Video slowed down 10% to make the speaker appear a bit slow.
- Video played at double speed to make the speaker appear silly.
- Video recorded secretly when the speaker is unaware they’re being recorded.
- Clips of the subject being hit with a pie or milkshake.
- Video with the pitch altered — higher or lower.
- Video with the pitch and speed altered randomly to make the speaker appear to be deranged.
- Video that’s been color-manipulated or darkened to make the speaker appear nefarious.
- Video assembled from bits and pieces of speech to create an obvious fake or parody of the speaker saying something they never said.
- “Deepfakes” that use AI to show the speaker convincingly saying something they never said.
Obviously, this is not a complete list. Some of these are nasty and manipulative, others seem like appropriate journalistic techniques, and still others are clearly parody. But in any given situation, applied to any given speaker, some viewers might see them as unfair, misleading, or false.
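To see how low the bar is, here is a rough sketch of how two of the manipulations above (the slowdown and the pitch shift) could be produced with the free ffmpeg tool. The filenames are placeholders, and the specific filter values are my assumptions for illustration, not details taken from the actual Pelosi video.

```shell
# Slow playback to 75% speed (a "25% slowdown"): stretch the video
# timestamps and slow the audio tempo to match.
# in.mp4 and the output names are placeholders.
ffmpeg -y -i in.mp4 -vf "setpts=PTS/0.75" -af "atempo=0.75" out-slow.mp4

# Lower the audio pitch by ~20%: resample the audio clock down to
# 35280 Hz (44100 * 0.8), then restore the standard 44.1 kHz rate.
ffmpeg -y -i in.mp4 -af "asetrate=35280,aresample=44100" out-pitch.mp4
```

Each of these is a single command requiring no editing skill at all, which is why the moderation question can’t be waved away by assuming manipulation is rare or hard.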
So, which should Facebook ban?
There is no “automatic” way for Facebook to identify each of these and determine whether it is unfair or misleading. AI might be able to determine which category a video fits into, but not whether that technique was used for legitimate or unfair purposes.
Why this is an existential crisis for Facebook
Unlike the New York Times, which recently applied the word to Hope Hicks and her subpoena, when I say existential, I mean it: this question could threaten Facebook’s existence.
Consider Facebook’s choices.
It could do what it appears to be doing now: allow all sorts of manipulated video on the site, occasionally marking those videos as misleading or fake. If Facebook chooses this, it will descend into a cesspool of fakery and manipulation. It will become an unattractive place for people to spend time, and it will become irrelevant.
Alternatively, it could mark and delete these sorts of videos by using human judgment to determine if they’re inappropriate. In this formulation, the compilation of Pelosi clips would remain, but the slowed-down video would be flagged and taken down.
The problem with that choice is that it demands a lot of human labor, and every judgment will come under scrutiny, with critics decrying Facebook’s choices as foul and partisan. It’s not clear Facebook could survive the cost and labor required to police all that manipulated video.
It could ban video altogether, although that would make Facebook much flatter and more boring.
Or it could end the public newsfeed, an option that appears to be very much on Mark Zuckerberg’s mind right now.
None of these options are very attractive. But Facebook as it stands now — subject to public ridicule no matter what choice it makes — is going to have a very hard time as fake video gets more prevalent.