Low-rank hypergraph feature selection for multi-output regression
Author(s)
Zhu, Xiaofeng
Hu, Rongyao
Lei, Cong
Thung, Kim Han
Zheng, Wei
Wang, Can
Griffith University Author(s)
Year published
2019
Abstract
Current multi-output regression methods usually ignore the relationships among response variables, which makes it challenging to obtain an effective coefficient matrix for predicting the response variables from the features. We address these problems by proposing a novel multi-output regression method that combines sparse feature selection and low-rank linear regression in a unified framework. Specifically, we first utilize a hypergraph Laplacian regularization term to preserve the high-order structure among all the samples, and then use a low-rank constraint to discover the hidden structure among the response variables and to explore the relationships among different features in a least-squares regression framework. As a result, we integrate subspace learning with sparse feature selection to select useful features for multi-output regression. We evaluated the proposed method on several public data sets, and the experimental results show that it outperforms the comparison methods.
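The objective the abstract describes can be illustrated with a small sketch. The code below is not the authors' algorithm: it is a minimal proximal-gradient solver for a simplified version of the model, min_W 0.5·||Y − XW||²_F + α·||W||_{2,1} + 0.5·β·tr(Wᵀ Xᵀ L X W), where the l2,1 norm zeroes whole rows of W (sparse feature selection) and L stands in for the hypergraph Laplacian. The low-rank constraint on W from the paper is omitted here for brevity, and the k-NN-style similarity graph, the synthetic data, and all parameter values are illustrative assumptions.

```python
import numpy as np

# Hedged sketch (not the paper's exact optimizer): proximal gradient on
#   min_W 0.5*||Y - X W||_F^2 + alpha*||W||_{2,1} + 0.5*beta*tr(W^T X^T L X W)
# The l2,1 norm drives whole rows of W to zero, i.e. feature selection.
# L below is an ordinary graph Laplacian built from a Gaussian similarity,
# used as a stand-in for the paper's hypergraph Laplacian; the low-rank
# constraint on W is omitted in this simplified version.
rng = np.random.default_rng(0)
n, d, m = 50, 20, 3                          # samples, features, responses
X = rng.standard_normal((n, d))
Y = X[:, :5] @ rng.standard_normal((5, m))   # responses use only 5 features

# Gaussian similarity graph over samples and its Laplacian L = D - S.
D2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
S = np.exp(-D2 / D2.mean())
np.fill_diagonal(S, 0.0)
L = np.diag(S.sum(1)) - S

alpha, beta, steps = 0.5, 0.1, 500
A = X.T @ X + beta * (X.T @ L @ X)           # Hessian of the smooth part
eta = 1.0 / np.linalg.eigvalsh(A).max()      # step size from Lipschitz const.
W = np.zeros((d, m))
for _ in range(steps):
    G = A @ W - X.T @ Y                      # gradient of the smooth part
    V = W - eta * G
    # Row-wise soft threshold: proximal operator of eta*alpha*||.||_{2,1}.
    norms = np.linalg.norm(V, axis=1, keepdims=True)
    scale = np.maximum(0.0, 1.0 - eta * alpha / np.maximum(norms, 1e-12))
    W = scale * V

selected = np.flatnonzero(np.linalg.norm(W, axis=1) > 1e-6)
print("selected feature indices:", selected)
```

Rows of W whose l2 norm survives the thresholding correspond to selected features; in the paper's full model a low-rank constraint additionally ties the response variables together through a shared subspace.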
Journal Title
World Wide Web
Note
This publication has been entered into Griffith Research Online as an Advanced Online Version.
Subject
Data management and data science
Distributed computing and systems software
Distributed computing and systems software not elsewhere classified
Information systems