Alarming Report Shows AI-Powered Voice Cloning Companies Don’t Have Enough Safeguards to Stop Fraud

Several AI voice cloning platforms lack adequate safeguards against fraud, according to an alarming new report.


Consumer Reports reveals that companies building generative AI voice cloning tools are not taking adequate measures to stop scammers, who are using the technology to defraud victims, the report warned.

The study found that four of the six firms examined fail to put meaningful barriers in place to stop people’s voices from being cloned without their consent. The report tested the voice cloning features of Lovo, ElevenLabs, Speechify, and PlayHT, among others.


The report showed how easy it was to produce voice clones from publicly available audio using these companies’ tools: all a user had to do was tick a checkbox confirming they had the legal right to clone the voice. The other two companies, Descript and Resemble AI, did have meaningful safeguards in place, including restrictions on uploads of pre-recorded audio.

Descript required users to read and record a full consent statement, and that recording served as the basis for the clone. Resemble AI required the voice to be recorded in real time; when testers tried to get around this with pre-recorded audio, the resulting clones were of low quality.
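To make that kind of gate concrete, here is a minimal sketch of how a consent-statement check might work. It is purely illustrative, not any vendor’s actual implementation; transcribe() is a hypothetical placeholder for a speech-to-text call, and the wording and threshold are assumptions.

```python
import difflib

# Hypothetical consent statement the user must read aloud (illustrative wording).
CONSENT_STATEMENT = (
    "I consent to the creation of a synthetic copy of my voice "
    "and confirm that this recording is my own voice."
)

def transcribe(audio_path: str) -> str:
    """Placeholder: run speech-to-text on the recording and return the transcript."""
    raise NotImplementedError("plug in a real speech-to-text model here")

def consent_statement_present(audio_path: str, threshold: float = 0.85) -> bool:
    """Return True only if the recording closely matches the required consent wording."""
    transcript = transcribe(audio_path).lower().strip()
    expected = CONSENT_STATEMENT.lower()
    similarity = difflib.SequenceMatcher(None, transcript, expected).ratio()
    return similarity >= threshold

def create_voice_clone(consent_audio: str, training_audio: str) -> None:
    """Refuse to build a clone unless the consent recording checks out."""
    if not consent_statement_present(consent_audio):
        raise PermissionError("Consent statement missing or incomplete; clone not created.")
    # ...only now train the voice model on training_audio...
```

Because the same recording can double as training data, a check like this adds friction for scammers working from scraped audio without inconveniencing legitimate users much.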

Even so, testers were able to bypass the weaker safeguards with little effort, which underscores the need for stronger safety policies. The report calls on the industry to set norms and standards that reduce the risk of fraud.

Generative AI tools have made convincing voice cloning possible, and the technique is already being used heavily in fraud. Scammers treat these tools as a powerful form of social engineering: they generate audio of a loved one or friend supposedly in trouble, then pressure the victim into sending money or handing over sensitive details.


The technology can also be used to bypass voice ID verification and gain access to bank accounts. The FBI has issued alerts about voice cloning and video cloning schemes being used in all kinds of financial fraud. UK-based Starling Bank recommends agreeing on a safe phrase with family and friends to prove the person on the other end of a call is not a clone, and limiting how much of your voice is publicly available on social media.

Some companies, such as Synthesia, produce lifelike content by cloning the video and voice of real people, which saves the costs of hiring a production team. Synthesia also has its own policy to stop AI abuse: it blocks non-consensual cloning by running biometric checks, so a request is approved only if the person submitting it matches the person appearing in the content.
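For illustration only, the sketch below shows one way such a biometric gate could be wired up, comparing speaker embeddings from the request audio and the content to be cloned. It is an assumption-laden sketch, not Synthesia’s actual system; embed_voice() stands in for any off-the-shelf speaker-verification model, and the similarity threshold is made up.

```python
import numpy as np

def embed_voice(audio_path: str) -> np.ndarray:
    """Placeholder: return a fixed-length speaker embedding for the recording."""
    raise NotImplementedError("plug in a speaker-verification model here")

def same_speaker(request_audio: str, content_audio: str, threshold: float = 0.75) -> bool:
    """Return True only if both recordings appear to come from the same speaker."""
    a = embed_voice(request_audio)
    b = embed_voice(content_audio)
    cosine = float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))
    return cosine >= threshold

def approve_clone_request(request_audio: str, content_audio: str) -> bool:
    # The person submitting the request must biometrically match the person
    # whose voice or likeness appears in the content being cloned.
    return same_speaker(request_audio, content_audio)
```

A check like this only helps if the request audio is captured live; combined with a recorded consent statement, it raises the bar considerably for anyone working from scraped recordings.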

 


In other words, the company moderates content at the point of creation, applying its policies before harmful media can be generated. The question is why other companies are not doing the same when the risk is so high.

How the six companies compare, question by question:

What customer information is required before you can create a custom voice clone?
- Eleven Labs: First name, email address, credit card information
- Speechify: Email
- LOVO: Name, email
- PlayHT: Name, email
- Descript: Name, email
- Resemble AI: Name, email, payment information

How much does it cost to create a custom voice clone?
- Eleven Labs: $5
- Speechify: $0
- LOVO: $0
- PlayHT: $0
- Descript: $0
- Resemble AI: $1

Are there technological barriers to frustrate non-consensual cloning?
- Eleven Labs: No; users are presented with a checkbox to confirm they will abide by the terms of service
- Speechify: No; users enter their full name after a self-certifying statement confirming they abide by the company’s terms
- LOVO: No; users are presented with a checkbox to confirm they will abide by the company’s terms
- PlayHT: No; users are presented with a checkbox to confirm they will abide by the company’s terms
- Descript: Yes; users must record or upload audio of an authorization statement that also trains the clone
- Resemble AI: For the first voice clone, any audio works; for each subsequent voice clone, a consent statement must be recorded or uploaded

Does the privacy policy give the company permission to use customer voices to train or improve its model?
- Eleven Labs: Yes, with opt-out
- Speechify: Yes, without offering an opt-out
- LOVO: No
- PlayHT: Unclear
- Descript: Yes; no opt-out of voice model use for training, but projects can be opted out
- Resemble AI: Yes, without offering an opt-out

Does the privacy policy permit the company to allow other customers or companies to use your voice model or data?
- Eleven Labs: Yes, if you opt in; those who opt in are compensated
- Speechify: Yes, with the user’s consent
- LOVO: No
- PlayHT: Unclear; the company can do anything with consent and for legitimate business interests
- Descript: Yes, with the user’s consent
- Resemble AI: No

Does the privacy policy grant users the right to delete all their voice data and other personal data?
- Eleven Labs: Yes, though possibly only in jurisdictions that offer data deletion rights
- Speechify: Yes, though the company retains some data for “legitimate” business interests
- LOVO: Yes, with minor exceptions
- PlayHT: Yes, with minor exceptions
- Descript: Yes, though the company retains some data for “legitimate” business interests
- Resemble AI: Yes, though the company retains some data for “legitimate” business interests

