At a time when the term "fake news" has become a household phrase thanks to its repeated use by President Donald Trump, deepfakes (seemingly realistic videos that have in fact been manipulated) can further deepen the problem of distrust in the media.
If a deepfake algorithm is trained on enough footage of a subject, someone can use the technology to manipulate video so that the subject appears to be saying or doing pretty much anything.
An anonymous Reddit account became infamous for creating fake, AI-assisted videos of celebrities, often pornographic. Although the creator's subreddit was banned in February 2018, its videos continue to circulate online.
Amid the threat of fake videos delegitimizing genuine recordings, Amber is building a middle layer to detect malicious alterations and has developed both detection and authentication technology.
It is highly unlikely that a deepfake video would carry the very same cryptographic signature as the victim's camera, which means one can prove which video was recorded by the camera and which one was not.
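The article does not detail how Amber's authentication works internally. As a minimal sketch of the general idea — footage is fingerprinted and signed at capture time, so any alteration voids the signature — the following uses Python's standard-library `hmac` as a stand-in for a real asymmetric signature scheme; `DEVICE_KEY`, `sign_recording` and `verify_recording` are illustrative names, not Amber's API:

```python
import hashlib
import hmac

# Hypothetical device key; in a real camera this would live in secure
# hardware and be an asymmetric key pair, not a shared HMAC secret.
DEVICE_KEY = b"camera-secret-key"

def sign_recording(video_bytes: bytes) -> str:
    """Camera side: hash the footage and sign the digest at capture time."""
    digest = hashlib.sha256(video_bytes).digest()
    return hmac.new(DEVICE_KEY, digest, hashlib.sha256).hexdigest()

def verify_recording(video_bytes: bytes, signature: str) -> bool:
    """Verifier side: any edit to the footage changes the hash,
    so the stored signature no longer matches."""
    expected = sign_recording(video_bytes)
    return hmac.compare_digest(expected, signature)

original = b"...raw video frames..."
sig = sign_recording(original)

print(verify_recording(original, sig))          # authentic footage -> True
print(verify_recording(original + b"x", sig))   # altered footage  -> False
```

Even a one-byte change to the video produces a completely different SHA-256 digest, which is why a deepfake cannot reproduce the camera's signature without the device key.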
On Oct. 3, 2019, Axon Enterprise Inc., a tech manufacturer for U.S. law enforcement, announced that it is exploring new data-tracking technology for its body cameras and will rely on blockchain technology to verify the authenticity of police body cam videos.
The Media Forensics program of the Defense Advanced Research Projects Agency, commonly known as DARPA, is developing "technologies for the automated assessment of the integrity of an image or video."

To help prove video alterations, Factom Protocol has come up with a solution called Off-Blocks.
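Neither Axon's nor Factom's exact design is described here. The common blockchain-anchoring pattern they allude to can be sketched as follows: only a hash of the footage is registered, with a timestamp, and anyone can later check whether a given file matches a registered entry. The in-memory `ledger` list below is a toy stand-in for an actual immutable chain:

```python
import hashlib
import time

# Toy in-memory "ledger" standing in for a real blockchain; an actual
# deployment would write these entries to an immutable, public chain.
ledger = []

def anchor(video_bytes):
    """Register the footage's fingerprint (not the footage itself)
    together with the time it was anchored."""
    digest = hashlib.sha256(video_bytes).hexdigest()
    ledger.append({"hash": digest, "anchored_at": time.time()})
    return digest

def is_anchored(video_bytes):
    """Check whether this exact footage was registered earlier.
    Any alteration yields a different hash, so it will not match."""
    digest = hashlib.sha256(video_bytes).hexdigest()
    return any(entry["hash"] == digest for entry in ledger)

footage = b"body-cam recording"
anchor(footage)
print(is_anchored(footage))         # original footage -> True
print(is_anchored(footage + b"!"))  # altered footage  -> False
```

Because only the digest is published, the footage itself stays private, while the timestamped chain entry lets anyone prove the video existed in that exact form at anchoring time.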
"At a time of heightened scrutiny around the veracity of news, content, and documentation, the rise of deepfake technology poses a significant threat to our society. As this phenomenon becomes more pronounced and accessible, we could arrive at a situation whereby the authenticity of a wide array of video content will be challenged. This is a dangerous development that blurs the line around digital identity - something that should be upheld with the most rigorous security measures."
"Take the example of nuclear energy. It can be used to power the homes of millions of people. In the wrong hands, it could even be used to kill millions. Technologies by themselves don't have any moral code, but humans do. Deepfakes can be used to make entertaining applications that could soon be on your mobile phone. But the same applications can ruin lives and the fabric of society if used by malicious actors."
As Deepfake Videos Spread, Blockchain Can Be Used to Stop Them
Published on Oct 9, 2019 by Cointele on Coinage