A Toronto man says he lost $12,000 after falling for a deepfake cryptocurrency scam that used Justin Trudeau’s likeness to endorse a fraudulent investment platform.
The scam spread through YouTube videos manipulated with AI and voice-cloning technology to make it appear as though Prime Minister Trudeau was promoting a cryptocurrency exchange and investment platform aimed at “helping Canadians secure their economic future.”
“I thought, ‘It’s got to be legitimate, it’s got to be perfect. Otherwise, how could he be prime minister? This has to be official,’” Stephen Henry told CTV.
Henry initially invested $250 and then continued pouring in his savings, believing the value of his investment had grown to more than $40,000.
Recommended from Editorial
- Deepfake Trudeau sells cryptocurrency with an accent, raising worries about the future of AI disinformation
- Canada’s cybersecurity is under siege and the government is powerless
When Henry attempted to withdraw some of his money and failed, he realized he had been scammed.
“Now I’m stripped of any possibility of making a living. That was all the money I had,” he said.
Henry is far from alone. As the quality and accessibility of deepfake technology improve, scams that exploit the images of politicians and celebrities to deceive people are on the rise.
Taylor Swift, Pope Francis, and Ukrainian President Volodymyr Zelenskiy are just a few examples of people whose likenesses have been used in deepfake scams and misinformation campaigns.
Fraudsters use AI and voice-cloning technology to create highly convincing but fraudulent endorsements. AI and machine-learning algorithms can mimic voices, overlay faces, and recreate mannerisms and vocal patterns.
Even less believable ads can be effective, especially on people unfamiliar with recent advances in AI technology.
Facebook users may have recently spotted an ad on the platform featuring a deepfake Justin Trudeau promoting a cryptocurrency scam.
The fake ad uses footage of a CBC interview in which Trudeau speaks with an Australian accent.
“The characteristic of a scam is that it needs to be real enough to catch someone, but at the same time fake enough to filter out those who would see through it,” Aengus Bridgman, an assistant professor at McGill University, told the National Post last month.
Bridgman said that while the Trudeau ad was poorly done, its crudeness served a purpose: weeding out experienced users so that those who respond are the people most likely to invest their money in the scam.
“These are the types of people these ads are trying to capture: people who are not digitally savvy, just as seniors in Canada fall prey to phone scams and identity theft,” Bridgman said.
In a statement to CTV, the Prime Minister’s Office acknowledged the challenges posed by deepfake technology and the spread of disinformation targeting elected officials.
“The amount of deceptive, false and misleading information and accounts targeting elected officials is becoming increasingly concerning and unacceptable, especially in the age of deepfake technology,” the statement read.
As the federal government scrambles to keep up with advances in the technology, it says public education and fostering critical engagement with information are key protective strategies.
“Social norms and discourse around deepfakes work to create a social environment where people are not only more skeptical of what they see, but are also encouraged to challenge each other’s information claims,” the Canadian Security Intelligence Service (CSIS) notes.
Some technology companies and social media platforms are using a combination of human review and automated methods to detect deepfakes, while there are also movements calling for legal frameworks to hold deepfake creators and distributors accountable and to protect victims of defamation.
“Thought leaders and key figures in social networks are key to changing social norms,” CSIS added. “Educational resources, including digital literacy training, are a helpful tool. Videos explaining political deepfakes can reduce uncertainty and thereby increase trust in media, but norms only really change through collective action.”