Towards Designing More Secure Speaker Verification Systems

Source: Deep Learning on Medium


With increasing digital growth, many people are publishing personal data on the Internet, including voice samples, video clips and facial images, all of which contain biometric traits that serve as unique identifiers.

Biometric technology, particularly the use of voice for strong user authentication, is progressively being baked into security systems and modern consumer devices to regulate access to restricted domains.

But voice technology is susceptible to malicious attacks. Unprotected automatic speaker verification (ASV) systems can be easily spoofed using replay, voice conversion (VC) and text-to-speech (TTS) attacks. And there is more: research shows that mimicry attacks can also pose a serious threat to automatic speaker verification.

Voice Mimicry Attacks Assisted by Automatic Speaker Verification

Recently, a group of researchers studied an overlooked ASV attack: mimicry, which involves human-based voice modification. In the study, the researchers carried out an experimental assessment of attacks on a voice biometric system. They searched for the speakers with the most similar voices to a target and used them as impostors. The audio was gathered from publicly available voice data, and an ASV tool itself was used to perform the voice-similarity search.
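The similarity search can be sketched roughly as follows: extract a speaker embedding for each candidate and rank candidates by similarity to the target. This is a minimal illustrative sketch, assuming fixed-length speaker embeddings (e.g. x-vectors) are already available; the function names and cosine-similarity scoring are my assumptions, not the paper's exact method.

```python
import numpy as np

def cosine_similarity(a, b):
    # Cosine similarity between two embedding vectors.
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def closest_impostors(target_emb, candidate_embs, k=3):
    """Rank candidate speakers by voice similarity to the target.

    candidate_embs: dict mapping speaker name -> embedding vector.
    Returns the k most similar (name, score) pairs, best first.
    """
    scored = [(name, cosine_similarity(target_emb, emb))
              for name, emb in candidate_embs.items()]
    return sorted(scored, key=lambda t: t[1], reverse=True)[:k]
```

In practice the paper's setup scores candidates with a full ASV system rather than a bare cosine over embeddings, but the ranking idea is the same: the attacker reuses the verification technology to pick the best-matching impostor.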

Figure: Comparison of attackers’ ASV scores (log likelihood ratios) to the targets’ scores for both of the ASV systems involved in the study. The scores are averaged over all attackers and all speech segments. The error bars represent 95% confidence intervals for the means.
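The averaging described in the caption is standard: for each group of log-likelihood-ratio scores, report the mean with a 95% confidence interval. A minimal sketch, assuming the common normal approximation (mean ± 1.96 × standard error); the paper may compute its intervals differently.

```python
import numpy as np

def mean_ci95(scores):
    """Mean of a set of ASV scores with a 95% confidence interval.

    Uses the normal approximation: mean +/- 1.96 * standard error.
    """
    scores = np.asarray(scores, dtype=float)
    m = scores.mean()
    sem = scores.std(ddof=1) / np.sqrt(len(scores))  # standard error of the mean
    return m, (m - 1.96 * sem, m + 1.96 * sem)
```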

Potential Uses and Effects

The world is increasingly becoming digital. Tasks are becoming more automated with less and less human-to-human communication. This calls for more effective ways to protect digital data from unauthorized access.

This research work demonstrates that mimicry belongs on the list of potential attacks against ASV.

People whose voice data is openly available in the public domain are easy targets for mimicry attacks. And although searching the Internet for matching biometric data is not feasible with today's technology, it could well become possible in the near future.

Personally, I think studying mimicry attacks can help the machine learning community design better and more secure ASV algorithms.

Read more: https://arxiv.org/abs/1906.01454

Thanks for reading. Please comment, share and remember to subscribe to our weekly newsletter for the most recent and interesting research papers! You can also follow me on Twitter and LinkedIn. Remember to 👏 if you enjoyed this article. Cheers!