A deepfake image of Hollywood actor Tom Hanks that "easily passes as real" was made for less than $100 as part of an experiment.
The image, made using machine-learning algorithms, was released this week to demonstrate the harrowing effects of deepfakes.
Philip Tully, a data scientist, made the image for less than $100 to test how easily it could be produced using software from artificial intelligence (AI) labs.
He said: "People with not a lot of experience can take these machine-learning models and do pretty powerful things with them."
Tully further warned that such techniques could be used for disinformation campaigns.
A Wired report judged that the Hanks image could "easily pass as real".
It read: "There are many photos of Tom Hanks, but none like the images of the leading everyman shown at the Black Hat computer security conference Wednesday: They were made by machine-learning algorithms, not a camera."
The image was made by gathering hundreds of photographs of Hanks and feeding them into machine-learning software.
Earlier this week, experts warned deepfakes could become even more dangerous thanks to the coronavirus pandemic.
Due to lockdown, TV interviews and talk shows have been filmed from people's living rooms at a reduced quality.
Peter Singer, a cybersecurity strategist, believes the boom of low-quality, self-filmed footage will make deepfake technology harder to stop.
He said: “The quality bar does not need to be exceedingly high when it comes to synthetic generations.
"It only needs to be 'good enough' for even just a subset of vocal users to not question it in a world characterized by rapid, high-volume information consumption."
Deepfakes are fabricated audio or video clips that show a person saying or doing something they never actually did.
Ex-US President Barack Obama and his successor Donald Trump have both been victims, with videos showing them saying things they did not say.
It is feared deepfakes could wreak havoc on the next US election.