Material Based Object Tracking in Hyperspectral Videos
Griffith University Author(s)
Zhou, Jun
Qian, Yuntao
Date
2020
Abstract
Traditional color images depict only color intensities in the red, green, and blue channels, which often causes object trackers to fail in challenging scenarios such as background clutter and rapid changes in target appearance. In contrast, the material information about targets contained in the large number of bands of hyperspectral images (HSIs) is more robust under these difficult conditions. In this paper, we conduct a comprehensive study of how material information can be used to boost object tracking from three aspects: dataset, material feature representation, and material-based tracking. For the dataset, we construct a collection of fully annotated videos that contain both hyperspectral and color sequences of the same scenes. Material information is represented by a spectral-spatial histogram of multidimensional gradients, which describes the local 3D spectral-spatial structure in an HSI, and by the fractional abundances of the constituent material components, which encode the underlying material distribution. These two types of features are embedded into correlation filters, yielding material-based tracking. Experimental results on the collected dataset demonstrate the potential and advantages of material-based object tracking.
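The pipeline summarized in the abstract can be illustrated with a short sketch. The Python code below is not from the paper: the function names are hypothetical, the 3D gradient orientation is collapsed into a single azimuth histogram for brevity, and a single-channel MOSSE-style filter stands in for the correlation filters the authors build their material-based tracker on.

    import numpy as np

    def spectral_spatial_gradient_histogram(cube, n_bins=8):
        # cube: H x W x B hyperspectral patch. np.gradient returns the
        # derivatives along the y, x, and band axes, giving a local 3D
        # spectral-spatial gradient at every voxel.
        gy, gx, gb = np.gradient(cube.astype(np.float64))
        magnitude = np.sqrt(gx ** 2 + gy ** 2 + gb ** 2)
        # Simplification: bin only the spatial azimuth, weighted by the
        # full 3D gradient magnitude (the paper's descriptor is richer).
        azimuth = np.arctan2(gy, gx)
        hist, _ = np.histogram(azimuth, bins=n_bins, range=(-np.pi, np.pi),
                               weights=magnitude)
        return hist / (hist.sum() + 1e-12)  # L1-normalized histogram

    def learn_filter(feature_patch, target_response, reg=1e-2):
        # Closed-form MOSSE-style correlation filter, solved per frequency:
        # H* = (G . conj(F)) / (F . conj(F) + reg).
        F = np.fft.fft2(feature_patch)
        G = np.fft.fft2(target_response)
        return (G * np.conj(F)) / (F * np.conj(F) + reg)

    def track_step(h_star, new_patch):
        # Correlate the filter with a new feature patch; the response
        # peak gives the target's position within the search window.
        response = np.real(np.fft.ifft2(np.fft.fft2(new_patch) * h_star))
        return np.unravel_index(np.argmax(response), response.shape)

    # Hypothetical single-frame usage, with one feature channel
    # (e.g., one fractional-abundance map) as the filter input.
    cube = np.random.rand(64, 64, 16)              # stand-in for an HSI patch
    descriptor = spectral_spatial_gradient_histogram(cube)
    goal = np.zeros((64, 64)); goal[32, 32] = 1.0  # desired response peak
    h_star = learn_filter(cube[..., 0], goal)
    peak = track_step(h_star, np.roll(cube[..., 0], (3, 2), axis=(0, 1)))

In the paper, the features fed to the filters are the multidimensional-gradient histograms and the per-pixel fractional abundances obtained by spectral unmixing, typically as multiple channels rather than the single channel used here.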
Journal Title
IEEE Transactions on Image Processing
Volume
29
Subject
Artificial intelligence
Cognitive and computational psychology
Science & Technology
Computer Science, Artificial Intelligence
Engineering, Electrical & Electronic
Computer Science
Citation
Xiong, F; Zhou, J; Qian, Y, Material Based Object Tracking in Hyperspectral Videos, IEEE Transactions on Image Processing, 2020, 29, pp. 3719-3733