Open access paper

LiveEar: An Efficient and Easy-to-use Liveness Detection System for Voice Assistants


Published under licence by IOP Publishing Ltd
Citation: Ling Yue et al 2021 J. Phys.: Conf. Ser. 1871 012046. DOI: 10.1088/1742-6596/1871/1/012046


Abstract

Voice assistants such as Amazon Alexa, Apple Siri, and Tmall Genie, which use voice biometrics for identity authentication, are becoming pervasive in our daily lives. However, voice assistants are vulnerable to replay attacks due to the open nature of voice-input channels: an attacker can record a victim's voice commands and replay them to spoof the assistant. Existing liveness detection approaches are mostly based on machine learning methods, which are computationally expensive and complex. Recently, several approaches have been proposed that leverage human-specific voice features or the acoustic distinctness of sound played through a loudspeaker. However, they require the user and the voice assistant to be in a fixed position and at very close range, which is not user-friendly in practice. This paper proposes LiveEar, an efficient and easy-to-use liveness detection system for voice assistants. LiveEar exploits the differences in phoneme-origin positions between live-human voices and voices replayed through loudspeakers. Specifically, it calculates the time-difference-of-arrival (TDoA) of a sequence of phoneme sounds at the microphones of the voice assistant, and then trains an SVM-based classification model on the extracted TDoA features. This paper implements a prototype of LiveEar and evaluates its performance on real-world data. Results show that LiveEar achieves high detection accuracy across a range of user positions, with negligible runtime overhead.
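To make the TDoA step concrete, below is a minimal, hypothetical sketch of the core measurement the abstract describes: estimating the arrival-time difference of a short sound between a reference signal and a delayed copy via cross-correlation. All signal parameters (sample rate, tone frequency, delay) and helper names are illustrative assumptions, not details from the paper; the resulting per-phoneme TDoA values would form the feature vector fed to the SVM classifier.

```python
import math

def cross_correlation_lag(a, b, max_lag):
    """Return the lag (in samples) at which b best aligns with a,
    found by brute-force cross-correlation over [-max_lag, max_lag]."""
    best_lag, best_score = 0, float("-inf")
    for lag in range(-max_lag, max_lag + 1):
        lo = max(0, -lag)               # keep indices of b in bounds
        hi = min(len(a), len(b) - lag)
        score = sum(a[i] * b[i + lag] for i in range(lo, hi))
        if score > best_score:
            best_lag, best_score = lag, score
    return best_lag

def tdoa_seconds(sig_ref, sig_delayed, sample_rate, max_lag=64):
    """TDoA of sig_delayed relative to sig_ref, in seconds."""
    return cross_correlation_lag(sig_ref, sig_delayed, max_lag) / sample_rate

if __name__ == "__main__":
    fs = 16000                          # assumed sample rate (Hz)
    tone = [math.sin(2 * math.pi * 440 * t / fs) for t in range(256)]
    delay = 10                          # simulated 10-sample propagation delay
    delayed = [0.0] * delay + tone[:-delay]
    print(tdoa_seconds(tone, delayed, fs))  # recovers 10 / 16000 s
```

In the paper's setting, the two signals would come from spatially separated microphones (or successive phonemes), and the pattern of TDoA values over a command distinguishes a moving human mouth from a static loudspeaker; the sketch only shows the delay-estimation primitive.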


Content from this work may be used under the terms of the Creative Commons Attribution 3.0 licence. Any further distribution of this work must maintain attribution to the author(s) and the title of the work, journal citation and DOI.
