    An evidential combination method with multi-color spaces for remote sensing image scene classification

    Author(s)
    Huang, Linqing
    Zhao, Wangbo
    Liew, Alan Wee-Chung
    You, Yang
    Griffith University Author(s)
    Liew, Alan Wee-Chung
    Year published
    2023
    Abstract
    Remote sensing image scene classification aims to assign semantic labels to images according to their content. A Convolutional Neural Network (CNN) is often used to extract deep, discriminative features of remote sensing images for classification. In practice, CNNs are usually trained on images in the Red Green Blue (RGB) color space, but they can also be trained on images in other color spaces, e.g., Hue Saturation Value. CNN models trained on images in different color spaces perform differently because different color spaces emphasize different color information. We therefore present an Evidential Combination method with Multi-color Spaces (ECMS) that integrates the complementary information of different color spaces to improve classification performance. In ECMS, labeled remote sensing images in the RGB color space are first converted into other color spaces and then used to train separate CNN models. The soft classification results these CNN models yield on query images are combined using evidence theory. During fusion, the outputs of the different CNN models usually have different reliabilities (weights), so they should not be treated equally. In our approach, the weights are learnt by minimizing the mean squared error between the combination results and the ground truth on labeled images. The weighted evidence combination of the soft classification results is then used to make the scene class decision. We conducted experiments on several datasets to verify the effectiveness of ECMS; the results show that ECMS significantly improves classification accuracy compared with many existing methods.
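    The weight-learning and fusion steps described in the abstract can be illustrated with a small sketch. The snippet below is a simplified assumption, not the paper's ECMS implementation: it replaces the evidence-theoretic (Dempster-Shafer style) combination with a plain weighted average of the per-color-space softmax outputs, and learns the reliability weights by gradient descent on the mean squared error against one-hot labels. The function names learn_weights and combine are hypothetical.

```python
# Sketch only: weighted fusion of per-color-space CNN soft outputs,
# with reliability weights learnt by minimising mean squared error
# against one-hot ground truth on labelled images.
import numpy as np

def learn_weights(probs_per_space, labels, lr=0.1, epochs=200):
    """probs_per_space: (S, N, C) softmax outputs of S CNNs (one per color
    space) on N labelled images with C classes; labels: (N,) integer labels."""
    S, N, C = probs_per_space.shape
    onehot = np.eye(C)[labels]                   # (N, C) ground truth
    w = np.full(S, 1.0 / S)                      # start from equal weights
    for _ in range(epochs):
        fused = np.tensordot(w, probs_per_space, axes=1)   # (N, C)
        err = fused - onehot
        # gradient of the mean squared error with respect to each weight
        grad = np.array([2.0 * (err * probs_per_space[s]).mean()
                         for s in range(S)])
        w -= lr * grad
        w = np.clip(w, 0.0, None)                # keep weights non-negative
        w /= w.sum()                             # and normalised
    return w

def combine(probs_per_space, w):
    """Fuse per-color-space soft outputs with the learnt weights and decide."""
    fused = np.tensordot(w, probs_per_space, axes=1)        # (N, C)
    return fused.argmax(axis=1)                             # predicted classes
```

    Used this way, the softmax outputs of the S trained models on a held-out labelled set are stacked into an (S, N, C) array, learn_weights is called once to obtain the reliability weights, and the same weights are reused when fusing the models' outputs on query images.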
    Journal Title
    Information Fusion
    Volume
    93
    DOI
    https://doi.org/10.1016/j.inffus.2022.12.025
    Publication URI
    http://hdl.handle.net/10072/421125
    Collection
    • Journal articles
