As Deepfake Videos Spread, Blockchain Can Be Used to Stop Them


At a time when the term "fake news" has become a household phrase thanks to its repeated use by President Donald Trump, deepfakes - i.e., seemingly realistic videos that have in fact been manipulated - threaten to deepen public distrust of the media even further.

If a deepfake algorithm is trained on enough footage of a subject, someone can use the technology to fabricate video that makes the subject appear to say or do practically anything.

An anonymous Reddit account became infamous for creating AI-assisted fake videos of celebrities, many of them pornographic.

Although the creator's subreddit was banned in February 2018, the videos remain publicly available.

Amid the threat of fake videos delegitimizing genuine recordings, Amber is building a middleware layer to flag malicious alterations, and has developed both detection and authentication technology.

A deepfake carrying the very same cryptographic signature as the victim's camera is highly unlikely, which means one can prove which video was recorded by that camera and which was not.
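
The article does not spell out Amber's pipeline, but the underlying idea - fingerprint footage with cryptographic hashes at the moment of capture, anchor those hashes somewhere immutable, and re-hash the file later to check for tampering - can be illustrated with a minimal Python sketch. The chunk size, function names and file paths below are illustrative assumptions, not Amber's actual implementation.

```python
import hashlib

CHUNK_SIZE = 1 << 20  # hash the recording in 1 MB chunks (illustrative choice)


def fingerprint(path: str) -> list[str]:
    """Compute per-chunk SHA-256 digests of a video file."""
    digests = []
    with open(path, "rb") as f:
        while chunk := f.read(CHUNK_SIZE):
            digests.append(hashlib.sha256(chunk).hexdigest())
    return digests


def verify(path: str, anchored: list[str]) -> bool:
    """Re-hash the file and compare against the digests that were
    anchored (e.g., written to a blockchain) at recording time."""
    return fingerprint(path) == anchored


# At capture time the camera would anchor fingerprint("clip.mp4").
# Any later edit to the footage changes at least one chunk digest,
# so verify("clip.mp4", anchored_digests) returns False for an altered file.
```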

On Oct. 3, 2019, Axon Enterprise Inc., a tech manufacturer for U.S. law enforcement, announced that it is exploring new data-tracking technology for its body cameras and will rely on blockchain technology to verify the authenticity of police body cam videos.

The Media Forensics program of the Defense Advanced Research Projects Agency, commonly known as DARPA, is developing "technologies for the automated assessment of the integrity of an image or video."

To help prove video alterations, Factom Protocol has come up with a solution called Off-Blocks.
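
The precise mechanics of Off-Blocks are not described here, but the general pattern it relies on - committing timestamped hashes of content to an append-only ledger so that any later alteration becomes detectable - can be sketched as follows. The HashLedger class, its method names and the fields it stores are hypothetical stand-ins, not Factom's API.

```python
import hashlib
import json
import time


class HashLedger:
    """Toy append-only ledger: each entry commits to the previous one,
    mimicking how a hash anchored on a blockchain becomes tamper-evident."""

    def __init__(self):
        self.entries = []

    def anchor(self, file_hash: str, signer: str) -> dict:
        prev = self.entries[-1]["entry_hash"] if self.entries else "0" * 64
        record = {
            "file_hash": file_hash,   # SHA-256 of the video or document
            "signer": signer,         # identity claimed by the submitter
            "timestamp": time.time(),
            "prev": prev,             # link to the previous entry
        }
        record["entry_hash"] = hashlib.sha256(
            json.dumps(record, sort_keys=True).encode()
        ).hexdigest()
        self.entries.append(record)
        return record

    def contains(self, file_hash: str) -> bool:
        """Check whether a given content hash was ever anchored."""
        return any(e["file_hash"] == file_hash for e in self.entries)


# A publisher could anchor the SHA-256 of each released video; a viewer who
# re-hashes a downloaded clip and finds no matching entry knows the file
# differs from what was originally committed.
ledger = HashLedger()
ledger.anchor(hashlib.sha256(b"original footage bytes").hexdigest(), "newsroom")
print(ledger.contains(hashlib.sha256(b"edited footage bytes").hexdigest()))  # False
```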

"At a time of heightened scrutiny around the veracity of news, content, and documentation, the rise of deepfake technology poses a significant threat to our society. As this phenomenon becomes more pronounced and accessible, we could arrive at a situation whereby the authenticity of a wide array of video content will be challenged. This is a dangerous development that blurs the line around digital identity - something that should be upheld with the most rigorous security measures."

"Take an example of Nuclear energy. It can be used to power the homes of millions of people. When in the wrong hands, it could even be used to kill millions. Technology by themselves don't have any moral code, but humans do. Deepfakes can be used to make entertaining applications that can soon be on your mobile phones. But the same applications can ruin lives and the fabric of society if used by malicious actors."
