    Linear Models for Dimensionality Reduction and Statistical Pattern Recognition for Supervised and Unsupervised Tasks

    File
    02Whole.pdf (2.138 MB)
    Author(s)
    Sharma, Alok
    Primary Supervisor
    Paliwal, Kuldip
    Other Supervisors
    Lisner, Peter
    Year published
    2006
    Abstract
    In this dissertation, a number of novel algorithms for dimension reduction and statistical pattern recognition for both supervised and unsupervised learning tasks are presented. Several existing pattern classifiers and dimension reduction algorithms are studied; their limitations and weaknesses are considered, and improved techniques are given that overcome several of their shortcomings. In particular, the following research work is carried out:
    • A literature survey of basic techniques for pattern classification, such as the Gaussian mixture model (GMM), the expectation-maximization (EM) algorithm, the minimum distance classifier (MDC), vector quantization (VQ), nearest neighbour (NN) and k-nearest neighbour (kNN), is conducted.
    • A survey of basic dimension reduction tools, namely principal component analysis (PCA) and linear discriminant analysis (LDA), is conducted. These techniques are also considered for pattern classification purposes.
    • A Fast PCA technique is developed which finds the desired number of leading eigenvectors at much lower computational cost and with far less processing time than the basic PCA model.
    • A gradient LDA technique is developed which solves the small sample size problem, which is not possible with the basic LDA technique.
    • A rotational LDA technique is developed which reduces the overlap of samples between classes to a far greater extent than the basic LDA technique.
    • A combined classifier using MDC, class-dependent PCA and LDA is designed which improves performance beyond what is possible with single classifiers. PCA is applied prior to LDA in such a way that the small sample size problem (if present) is avoided.
    • Splitting-technique initialization is introduced into the local PCA technique. The proposed integration enables easier data processing and a more accurate representation of multivariate data.
    • A combined technique using VQ and vector quantized principal component analysis (VQPCA) is presented which provides a significant improvement in classifier accuracy at very low storage and processing-time requirements compared with individual and several other classifiers.
    • A survey of unsupervised learning techniques, such as independent component analysis (ICA), is conducted.
    • A new perspective on subspace ICA (generalized ICA, where not all components need be independent) is introduced by developing a vector kurtosis function, an extension of kurtosis.
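    As context for the PCA-based contributions listed above, the following is a minimal sketch of basic PCA dimension reduction in NumPy — the baseline model the thesis improves on, not the thesis's own Fast PCA algorithm (the function name and example data are illustrative):

    ```python
    import numpy as np

    def pca_reduce(X, k):
        """Project n samples (rows of X) onto the k leading principal components."""
        # Center the data so the covariance is computed about the mean
        Xc = X - X.mean(axis=0)
        # Sample covariance matrix of the features
        cov = np.cov(Xc, rowvar=False)
        # Eigendecomposition of the symmetric covariance matrix;
        # eigh returns eigenvalues in ascending order
        vals, vecs = np.linalg.eigh(cov)
        # Keep the k eigenvectors with the largest eigenvalues
        order = np.argsort(vals)[::-1][:k]
        W = vecs[:, order]
        # Reduced (n, k) representation
        return Xc @ W

    # Example: reduce 5-dimensional samples to 2 dimensions
    rng = np.random.default_rng(0)
    X = rng.normal(size=(100, 5))
    Z = pca_reduce(X, 2)
    print(Z.shape)  # (100, 2)
    ```

    The basic model requires a full eigendecomposition of the covariance matrix; the Fast PCA technique described in the abstract targets exactly this cost by computing only the desired leading eigenvectors.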
    Thesis Type
    Thesis (PhD Doctorate)
    Degree Program
    Doctor of Philosophy (PhD)
    School
    Griffith School of Engineering
    DOI
    https://doi.org/10.25904/1912/3743
    Copyright Statement
    The author owns the copyright in this thesis, unless stated otherwise.
    Item Access Status
    Public
    Subject
    linear models
    dimensionality reduction
    statistical pattern recognition
    algorithms
    learning tasks
    Publication URI
    http://hdl.handle.net/10072/365298
    Collection
    • Theses - Higher Degree by Research
