Plenty of warnings about voice-clone scams, but victims are scarce
A scammer who calls you with the voice of your mother or another loved one and then asks you to transfer money: a creepy thought. But not science fiction, according to an AD article from June last year headlined « Criminals will also pose as a friend on the phone ». The newspaper predicted that scams using AI-cloned voices would « gain a foothold in the Netherlands this year ». Broadcaster WNL went even further, reporting that criminals in the Netherlands were already « striking in this way ». Nearly all major media covered the subject, including NRC, albeit reluctantly: « What do we do if we can no longer trust the voices of our loved ones? »
In the summer of 2024, the central government even launched an awareness campaign. It commissioned a study showing that many Dutch people cannot distinguish the real voice of radio DJ Ruud de Wild from his voice clone, and it expected that this trick « will also be used by criminals in the Netherlands. »
Nevertheless, after more than a year of alarming reporting, people who have actually been scammed with a voice clone of someone they know are hard to find. The Fraud Help Desk has so far received only two reports of scams in which, according to the person reporting, a cloned voice was involved. « And we were unable to verify that, » says spokesperson Tanya Wijngaarde. The Fraud Help Desk is the most important national reporting point for online scam attempts. « If we don't know about something, chances are it isn't happening. » The desk, which itself had been very worried, is « surprised » by the meagre harvest.
Advanced technology
There is no doubt that digital scams are on the rise: the Fraud Help Desk receives around 60,000 reports of (mostly online) fraud attempts every month. But experts approached by NRC question the attention that voice cloning receives within that total. According to Theo Gevers, professor of computer vision and AI at the University of Amsterdam, there are still major obstacles to deceiving someone by telephone with a voice clone of a family member or friend. The technology may be ‘quite advanced’, according to Gevers, but conducting a live conversation through a voice clone is very laborious.
First, the technology, which is indeed advanced. ElevenLabs, the gold standard in the voice-cloning world, needs only ten seconds of someone's voice to clone it. The user can then type in whatever he wants the clone to say; in jargon this is called text-to-speech. « In fact, you can no longer tell whether it is a real voice or not, » says Gevers. He develops AI voice clones himself, but for positive purposes such as trauma processing for victims of crime. « Voice clones are now also good at intonation and even emotion. »
But that alone does not get you there as a fraudster who wants to fool someone over the phone. He must be able to convert his own voice into the clone live; in other words, speech-to-speech. « The quality is lower than with text-to-speech, and there is a lot of delay before the answers. As we talk to each other now, the response time is between 0 and 0.1 seconds. With live AI voice clones it is 0.5 to 2 seconds. »
That of course sounds unnatural, and people notice it, says Gevers. Will those pauses soon disappear as the technology races ahead? Gevers has his doubts. « 0.1 seconds could be feasible, but then you need a very heavy computer chip, which is very expensive. » Another bottleneck is the language barrier. Voice-cloning technology is mainly trained on large languages such as English; according to Gevers, Dutch is not commercially attractive and therefore still underdeveloped. « Speech-to-speech conversations, especially, are really not possible in Dutch. »
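The turn-taking gap Gevers describes can be made concrete with a minimal sketch. The thresholds below are taken directly from the figures quoted in the article (human response time 0 to 0.1 seconds, live AI voice clones 0.5 to 2 seconds); the function name and the intermediate band are illustrative assumptions, not anything the researchers published.

```python
# Minimal sketch of the conversational-latency gap described by Gevers.
# Thresholds come from the article: humans respond within 0-0.1 s,
# live AI voice clones within 0.5-2 s. The "noticeable pause" band
# in between is an assumption for illustration.

def classify_response_delay(seconds: float) -> str:
    """Label a conversational response delay by how natural it would feel."""
    if seconds <= 0.1:
        return "natural"            # typical human turn-taking
    elif seconds < 0.5:
        return "noticeable pause"
    else:
        return "suspiciously long"  # range reported for live voice clones

for delay in (0.05, 0.3, 1.2):
    print(f"{delay:.2f} s -> {classify_response_delay(delay)}")
```

The point of the sketch is simply that the reported clone latency sits well outside the band people experience as a normal conversational rhythm, which is why listeners notice.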
Customization
Moreover, scamming with voice cloning requires a lot of tailor-made work. « You have to select a victim, find a loved one or family member, find a voice recording of them, and then make a good voice clone, » says Boudewijn van der Valk, who leads the fraud expert team at ING. As far as he knows, this method is currently ‘non-existent’, although you can never rule out 100 percent that it happens. What is certain is that it is on people's minds. « When I talk about my work at parties, this is one of the topics that comes up. »
Rolf van Wegberg, a university lecturer specializing in cybercrime at TU Delft, has also « not yet » seen voice-cloning scams « in the wild ». According to Van Wegberg, who also acts as an expert witness in court cases, the technology has « risk and potential », but like Van der Valk he calls it cumbersome and a lot of work. Scammers usually only put in that much effort for large targets. « Then you know there are hundreds of thousands in ransom to be collected. » With private individuals you do not know in advance ‘whether it will pay off’, so scammers look for more efficient methods.
Wijngaarde of the Fraud Help Desk says the same. She points to phone calls in which a recording is played, so-called robocalls. The victim is asked, for example, to give access to their computer or internet banking. Such a tactic can be unleashed automatically on countless people at the same time. The same applies to WhatsApp messages from a ‘child’ who has supposedly lost their phone and asks dad or mum for money from a borrowed one. Such methods are so successful that scammers may simply not need more advanced tricks, says Wijngaarde.
Focus on anecdotes
That the technology to clone voices exists does not mean it is already being used for scams. Yet some stories circulate in the media, especially from abroad. Many articles, for example, repeat the same story of an American woman who was allegedly called with a voice clone of her daughter, who had supposedly been abducted, after which the ‘abductors’ demanded ransom. Gevers is skeptical about such unverifiable stories. « I will only believe it when the voice recording has been analyzed with voice tools, including the latest AI detectors. »
Gevers also points out that the conclusion that something is AI is sometimes drawn too quickly. He refers to an incident from 2021, when members of the House of Representatives believed they had spoken, via deepfake, with an aide of Alexei Navalny. In a deepfake, not only someone's voice but also their face is copied with AI. But the ‘aide’ turned out to be ‘just’ a flesh-and-blood impostor.
According to experts, there is a dilemma between wanting to warn in time and not wanting to scare people unnecessarily
The risk of attention for spectacular scenarios such as voice cloning is that it comes at the expense of attention for methods that affect far more Dutch people, such as the aforementioned WhatsApp fraud.
Van Wegberg believes the reporting on cybercrime « sometimes has a high Shownieuws content », referring to the Dutch celebrity-news programme. Journalists often ask him for a current example of cybercrime on which they can hang their article. « Then you quickly end up with something that happened to happen once. That is of course just an anecdote, not something representative. » Van der Valk, who in principle is pleased that media attention for online scams has increased, refers in this context to reporting last year about hackers who could supposedly penetrate smartphones by copying your face. That story later turned out not to be true. « Something like that is needless scaremongering about things we use in our daily lives. »
According to the experts, there is a dilemma between wanting to warn in time and not wanting to spread fear. Manon den Dunnen recognizes this « very much ». As a tech specialist with the police, she was unexpectedly called a lot by journalists about voice-cloning fraud last year. She decided to treat the subject as a peg to make people aware that « you can no longer trust everything you hear and see. And that a request to do something irreversible, such as transferring money or sharing confidential information, should always be verified with the sender. »
She told the AD at the time that she was very worried, and that many more reports would follow before the end of 2024. She asked colleagues to let her know of any report or official complaint. But it stayed at a handful of stories that she could not verify. Her fear did not come true. « In any case, not that we know of. That is a learning point for me: you don't know in advance. »
Whether she would do it differently next time, she finds a difficult question. « Because I still expect it to increase. » Where Gevers points to the shortcomings, such as the long response time and the for now inadequate Dutch, Den Dunnen emphasizes that people can also fall for an emotional fake call even if it does not sound perfect. « People latch onto the familiar voice they think they hear. Certainly if you add distracting background sounds or pretend the connection is bad. » According to her, this is evident from tests by the police.
Stories keep coming
Proof or not, the media attention for voice-cloning scams continues. Recently it again seemed a big problem. The Fraud Help Desk had announced that it received nearly 10,000 reports of fake phone calls in the first quarter of this year. These were by no means all about voice cloning. Nevertheless, large media outlets such as the AD and the NOS explicitly linked the figure to it. News site NL Times even wrote that « thousands of Dutch people » are being attacked with AI voice clones. Tanya Wijngaarde of the Fraud Help Desk confirms that the figures were misinterpreted. « I did not realize in advance that our figures would be linked to stories about AI, » she says. But that association seems impossible to escape.