Multisensory guidance of goal-oriented behaviour of legged robots
Author(s)
Shaikh, D
Manoonpong, P
Tuxworth, G
Bodenhagen, L
Griffith University Author(s)
Year published
2018
Abstract
Biological systems often combine cues from two different sensory modalities to execute goal-oriented sensorimotor tasks, which otherwise cannot be accurately executed with either sensory stream in isolation. When auditory cues alone are not sufficient to accurately localise an audio-visual target by orienting towards it, visual cues can complement their auditory counterparts and improve localisation accuracy. We present a multisensory goal-oriented locomotion control architecture that uses visual feedback to adaptively improve the acoustomotor orientation response of the hexapod robot AMOS II. The robot is tasked with localising an audio-visual target by turning towards it. The architecture extracts sound direction information with a model of the peripheral auditory system of lizards to modulate locomotion control parameters driving the turning behaviour. The visual information adaptively changes the strength of the acoustomotor coupling to adjust the turning speed of the robot. Our experiments demonstrate improved orientation towards an audio-visual target emitting a tone of frequency 2.2 kHz located at an angular offset of 45° from the robot.
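The coupling the abstract describes can be illustrated with a minimal sketch. This is not the authors' code: the function names, the normalised signal ranges, and the first-order gain-adaptation rule are all assumptions introduced here to show the idea of a visual error signal adaptively scaling the strength of an auditory-driven turning command.

```python
def turning_command(auditory_direction, visual_error, gain, lr=0.1):
    """Return (turn_speed, updated_gain).

    auditory_direction: signed direction estimate from the (lizard-ear
        inspired) auditory model, assumed normalised to [-1, 1];
        the sign encodes left/right.
    visual_error: angular offset of the target in the camera image,
        assumed normalised to [-1, 1]; used to adapt the coupling.
    gain: current acoustomotor coupling strength.
    lr: adaptation rate (hypothetical first-order rule).
    """
    # Strengthen the coupling while a large visual error persists,
    # and relax it as the robot converges on the target.
    gain = gain + lr * (abs(visual_error) - gain)
    # The turning speed is the auditory direction estimate scaled by
    # the visually adapted coupling strength.
    return gain * auditory_direction, gain


if __name__ == "__main__":
    gain = 0.5
    # Simulated approach: target initially far to one side, with both
    # error signals shrinking as the robot turns towards it.
    for aud, vis in [(0.7, 1.0), (0.5, 0.6), (0.2, 0.2), (0.05, 0.0)]:
        turn, gain = turning_command(aud, vis, gain)
        print(f"turn={turn:+.3f}  gain={gain:.3f}")
```

Under this toy rule the coupling gain rises while the visual error is large and decays once the target is centred, so the turning response is fast when misaligned and gentle near the goal.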
Conference Title
Human-Centric Robotics- Proceedings of the 20th International Conference on Climbing and Walking Robots and the Support Technologies for Mobile Machines, CLAWAR 2017
Copyright Statement
© 2017 World Scientific Publishing. This is the author-manuscript version of this paper. Reproduced in accordance with the copyright policy of the publisher. Please refer to the conference's website for access to the definitive, published version.
Subject
Autonomous agents and multiagent systems
Intelligent robotics
Social robotics
Reinforcement learning