Paper (Open access)

The fuzzy-based systems in the communication between a human and a humanoid robot

Published under licence by IOP Publishing Ltd
Citation: E Mogos 2022 J. Phys.: Conf. Ser. 2251 012003. DOI: 10.1088/1742-6596/2251/1/012003


Abstract

Communication between a human and a humanoid robot is a real challenge for researchers in robotics. Despite progress in acoustic modelling and natural language processing, humanoid robots are outperformed by humans in real-life settings because speech and human emotions are highly ambiguous owing to noise and external audio events in the robot's environment. Humans assign a correct interpretation to a perceived ambiguous signal, but humanoid robots cannot. The software most commonly used to interpret such ambiguous signals is fuzzy-based. The adaptive neuro-fuzzy inference system (ANFIS) is an emotion recognition system based on fuzzy sets; it plays the role the thalamus plays in the human brain and is responsible for the sensory perception of the humanoid robot. Our goal in this work is to create fuzzy-based sound-signal software and a fuzzy-based genetic algorithm with high performance in human-humanoid communication, helping humanoid robots to think and to understand human speech, human emotions, and the ambiguous signals from the robot's environment in the way a human does.
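To illustrate the kind of fuzzy inference the abstract describes, the following is a minimal sketch of a Mamdani/Sugeno-style fuzzy system that maps two acoustic features to an arousal score. It is a hypothetical illustration, not the authors' software: the feature names (`loudness_db`, `pitch_hz`), the membership-set boundaries, and the rule consequents are all assumptions chosen for the example.

```python
def tri(x, a, b, c):
    """Triangular membership function peaking at b over the support [a, c]."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def fuzzy_arousal(loudness_db, pitch_hz):
    """Estimate an arousal score in [0, 1] from noisy acoustic features."""
    # Fuzzify the inputs (set boundaries are illustrative assumptions).
    quiet = tri(loudness_db, 0, 30, 60)
    loud = tri(loudness_db, 40, 80, 120)
    low = tri(pitch_hz, 50, 120, 220)
    high = tri(pitch_hz, 150, 300, 500)

    # Rule base: min models AND; each rule fires toward a crisp arousal level.
    rules = [
        (min(quiet, low), 0.2),   # quiet AND low pitch  -> calm
        (min(quiet, high), 0.5),  # quiet AND high pitch -> alert
        (min(loud, low), 0.6),    # loud AND low pitch   -> tense
        (min(loud, high), 0.9),   # loud AND high pitch  -> excited
    ]

    # Weighted-average defuzzification (zero-order Sugeno style).
    total = sum(w for w, _ in rules)
    if total == 0:
        return 0.5  # no rule fires: fall back to a neutral score
    return sum(w * z for w, z in rules) / total
```

Because every input belongs to each fuzzy set to a degree, an ambiguous or noise-corrupted measurement still produces a graded, defensible output rather than a brittle hard classification; in ANFIS the set boundaries and rule consequents would additionally be tuned from data rather than fixed by hand.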


Content from this work may be used under the terms of the Creative Commons Attribution 3.0 licence. Any further distribution of this work must maintain attribution to the author(s) and the title of the work, journal citation and DOI.
