MULTISENSORY GUIDANCE OF GOAL-ORIENTED BEHAVIOUR OF LEGGED ROBOTS
Biological systems often combine cues from two different sensory modalities to execute goal-oriented sensorimotor tasks that cannot be executed accurately with either sensory stream in isolation. When auditory cues alone are insufficient for accurately localising an audio-visual target by orienting towards it, visual cues can complement their auditory counterparts and improve localisation accuracy. We present a multisensory goal-oriented locomotion control architecture that uses visual feedback to adaptively improve the acoustomotor orientation response of the hexapod robot AMOS II. The robot is tasked with localising an audio-visual target by turning towards it. The architecture extracts sound-direction information with a model of the peripheral auditory system of lizards and uses it to modulate the locomotion control parameters that drive the turning behaviour. The visual information adaptively changes the strength of the acoustomotor coupling to adjust the turning speed of the robot. Our experiments demonstrate improved orientation towards an audio-visual target emitting a 2.2 kHz tone and located at an angular offset of 45° from the robot's initial heading.
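To make the control idea concrete, the following is a minimal, heavily simplified sketch of the adaptive coupling loop, not the AMOS II implementation: the lizard peripheral auditory model is replaced by a noisy direction estimate, the visual pathway by an accurate estimate limited to an assumed 60° field of view, and the modulated locomotion parameters by a single turning gain. All function names, gains, and noise levels here are illustrative assumptions.

```python
import numpy as np

np.random.seed(0)

FOV_DEG = 60.0          # assumed camera field of view (deg)
TARGET_BEARING = 45.0   # target offset from initial heading (deg), as in the experiments


def auditory_direction(offset_deg, noise_std=10.0):
    """Stand-in for the lizard peripheral auditory model: a coarse, noisy
    estimate of the sound-source direction. The real model filters the two
    microphone signals through a pressure-difference receiver; here we only
    mimic its limited angular accuracy."""
    return offset_deg + np.random.normal(0.0, noise_std)


def visual_offset(offset_deg, noise_std=1.0):
    """Stand-in for the visual pathway: an accurate bearing estimate that is
    available only while the target lies inside the camera field of view."""
    if abs(offset_deg) > FOV_DEG / 2.0:
        return None
    return offset_deg + np.random.normal(0.0, noise_std)


heading = 0.0   # robot heading (deg)
gain = 0.05     # acoustomotor coupling strength (assumed initial value)

for step in range(300):
    offset = TARGET_BEARING - heading
    if abs(offset) < 2.0:
        print(f"aligned after {step} steps (heading {heading:.1f} deg)")
        break

    cue = auditory_direction(offset)
    vis = visual_offset(offset)
    if vis is not None:
        # Visual feedback adaptively rescales the acoustomotor coupling:
        # a large remaining visual offset strengthens the coupling so the
        # robot turns faster; a small offset weakens it to limit overshoot.
        gain = float(np.clip(0.3 * abs(vis) / TARGET_BEARING, 0.02, 0.3))

    # The coupling gain times the auditory cue stands in for the modulation
    # of the locomotion control parameters that drive the turn.
    heading += gain * cue
else:
    print(f"not aligned after 300 steps (residual offset {offset:.1f} deg)")
```

The point the sketch preserves is the division of labour described above: the accurate but view-limited visual estimate tunes only the strength of the acoustomotor coupling, while the always-available but coarse auditory cue drives the turning behaviour itself.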