Linear Models for Dimensionality Reduction and Statistical Pattern Recognition for Supervised and Unsupervised Tasks

Primary Supervisor

Paliwal, Kuldip

Other Supervisors

Lisner, Peter

Date
2006
Abstract

This dissertation presents a number of novel algorithms for dimensionality reduction and statistical pattern recognition, covering both supervised and unsupervised learning tasks. Several existing pattern classifiers and dimensionality-reduction algorithms are studied, their limitations and weaknesses are examined, and improved techniques are proposed that overcome several of these shortcomings. In particular, the following research is carried out:

• A literature survey of basic pattern-classification techniques is conducted, including the Gaussian mixture model (GMM), the expectation-maximization (EM) algorithm, the minimum distance classifier (MDC), vector quantization (VQ), nearest neighbour (NN), and k-nearest neighbour (kNN).

• A survey of basic dimensionality-reduction tools, namely principal component analysis (PCA) and linear discriminant analysis (LDA), is conducted. These techniques are also applied to pattern classification.

• A Fast PCA technique is developed that finds the desired number of leading eigenvectors at much lower computational cost, requiring far less processing time than the basic PCA model.

• A gradient LDA technique is developed that solves the small sample size problem, which the basic LDA technique cannot handle.

• A rotational LDA technique is developed that reduces the overlap of samples between classes far more effectively than the basic LDA technique.

• A combined classifier using MDC, class-dependent PCA, and LDA is designed, improving classification performance beyond what any single classifier achieves. PCA is applied before LDA in a way that avoids the small sample size problem (if present).

• Splitting-technique initialization is introduced into the local PCA technique. The proposed integration enables easier data processing and more accurate representation of multivariate data.

• A combined technique using VQ and vector quantized principal component analysis (VQPCA) is presented, providing a significant improvement in classification accuracy at very low storage and processing-time cost compared with individual and several other classifiers.

• A survey of unsupervised learning techniques, in particular independent component analysis (ICA), is conducted.

• A new perspective on subspace ICA (a generalized ICA in which not all components need be independent) is introduced by developing a vector kurtosis function, an extension of kurtosis.
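Of the techniques surveyed above, basic PCA is the simplest to illustrate concretely. The sketch below is a minimal NumPy illustration of the standard model (projecting centred data onto the leading eigenvectors of the sample covariance matrix); it is not the thesis's Fast PCA algorithm, and the function name and example data are the author's own.

```python
import numpy as np

def pca_reduce(X, k):
    """Project data X (n samples x d features) onto its k leading principal components."""
    Xc = X - X.mean(axis=0)                  # centre each feature
    cov = np.cov(Xc, rowvar=False)           # d x d sample covariance matrix
    eigvals, eigvecs = np.linalg.eigh(cov)   # eigh: symmetric input, ascending eigenvalues
    order = np.argsort(eigvals)[::-1]        # re-order eigenvalues descending
    W = eigvecs[:, order[:k]]                # d x k matrix of leading eigenvectors
    return Xc @ W                            # n x k reduced representation

# Hypothetical example: 3-D data with little variance in the third axis
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3)) * np.array([5.0, 2.0, 0.1])
Y = pca_reduce(X, 2)
print(Y.shape)  # (100, 2)
```

The basic model requires a full eigen-decomposition of the d x d covariance matrix; the Fast PCA contribution described in the abstract targets exactly this cost by computing only the desired number of leading eigenvectors.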

Thesis Type

Thesis (PhD Doctorate)

Degree Program

Doctor of Philosophy (PhD)

School

Griffith School of Engineering

Rights Statement

The author owns the copyright in this thesis, unless stated otherwise.

Item Access Status

Public

Subject

linear models

dimensionality reduction

statistical pattern recognition

algorithms

learning tasks
