Using inertial sensors to index into video
Video analysis is an important tool for players, coaches and sports scientists because it is more intuitive and familiar than other analysis methods. One issue with video is searching the data for specific events to examine. The common approach is to watch the video and hand-score it for later use, or to scan visually until the event of interest appears. There have been attempts to score the data automatically using image processing, but these require elaborate multi-camera systems coupled with complex image-processing software. Inertial sensors are cheap and readily available, and have been used to great effect to analyse athletes' performance and to detect selected events within it. Their advantage is that they can be mounted on the person or on the sporting equipment and can monitor performance continuously, without the problems of lighting, camera angle and occlusion that affect video systems. This paper describes a technique that extracts events from inertial sensor data and uses the timing of those events to index into synchronised video. The event type and timing information derived from the sensor data can be stored and used as search keys for specific events in the video. The technique is demonstrated through a tennis visualisation system that uses strokes detected by a racquet-mounted sensor as the index into the video.
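The core idea — detecting events in the sensor stream and mapping their timestamps onto frames of a synchronised video — can be sketched as follows. This is a minimal illustration, not the authors' implementation: the threshold-based peak detector, the sample rate, the synchronisation offset and all names here are hypothetical assumptions.

```python
def detect_events(samples, sample_rate_hz, threshold):
    """Return event times (s) where acceleration magnitude peaks above threshold.

    A simple local-maximum detector stands in for whatever stroke-detection
    method the sensor system actually uses.
    """
    events = []
    for i in range(1, len(samples) - 1):
        a = samples[i]
        if a > threshold and a >= samples[i - 1] and a > samples[i + 1]:
            events.append(i / sample_rate_hz)
    return events


def event_to_frame(event_time_s, sync_offset_s, video_fps):
    """Map a sensor event time onto a frame index in the synchronised video."""
    return round((event_time_s + sync_offset_s) * video_fps)


# Toy acceleration magnitudes (|a| in g); the two spikes represent strokes.
accel = [1.0, 1.1, 9.8, 1.2, 1.0, 1.1, 12.4, 1.3]
events = detect_events(accel, sample_rate_hz=100, threshold=5.0)
# Each detected event becomes a search key: a frame number in the video.
index = [event_to_frame(t, sync_offset_s=0.5, video_fps=25) for t in events]
```

Storing the resulting frame numbers alongside an event label (e.g. stroke type) gives exactly the kind of search index the paper describes: jumping straight to the relevant video segment instead of scrubbing through the footage.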
© 2012 The Authors. Published by Elsevier Ltd. Open access under the Creative Commons Attribution-NonCommercial-NoDerivs 3.0 Unported (CC BY-NC-ND 3.0) License, which permits unrestricted non-commercial use, distribution and reproduction in any medium, provided that the work is properly cited. You may not alter, transform, or build upon this work.