Manipulated White House Video Exposes Real Risks to Enterprises

On Nov. 7, CNN reporter Jim Acosta got into a heated discussion at a White House press conference with President Donald Trump over Russian election-meddling investigations. After some memorable name-calling from the president and verbal pushback from Acosta, there was a brief struggle for the microphone when a White House intern attempted to retrieve it from Acosta to pass to the next questioner.

Hours later, the White House announced that Acosta’s press credentials had been revoked and White House Press Secretary Sarah Sanders tweeted a video she said demonstrated the reporter had physically pushed away the intern, stating this was the reason for the revocation.

The problem? While Sanders’ video did indeed show Acosta apparently striking the intern, the footage does not quite match the readily available original recordings of the event; Sanders’ version appears to have been altered.

While the alteration in this case was noticeable and crudely done, today’s advanced video manipulation tools make much better alterations possible. Some, called “deep fakes,” use a combination of video-editing tools and AI to learn a subject’s actions and then use existing photographs and videos to make convincing fakes that can have the subject doing or saying almost anything.

Video manipulation today poses real risks both to the public’s view of reality and to your company. The first risk is that the public will eventually stop believing video of anything, and video will no longer be considered evidence of reality. But greater harm can come to your organization from those who want to cause damage or simply to gain some advantage over you.

Suppose a video were to surface on social media appearing to show your CEO making a racist or misogynistic statement in a meeting? Or suppose a video appeared that seemed to show a blatant case of sexual misconduct at a company function?

To mitigate these risks, preserve video evidence of the public actions of your organization’s senior executives so that you hold authentic, verifiable footage you can show has not been altered. Ensure these videos include timing information, full metadata and, where possible, GPS data. Also store copies of such videos where they can be searched, so that the event can be produced in its original form. That way, if someone uses the event to create a fake video, you have the original to refute it.