Transferable Convolutional Neural Network for Weed Mapping With Multisensor Imagery
Author(s)
Farooq, A
Jia, X
Hu, J
Zhou, J
Abstract
Automatic weed monitoring and classification are critical for effective site-specific weed management. With the increasing availability of different sensors, weed management can draw on a wide range of images captured from various remote sensing platforms. A deep learning-based convolutional neural network (CNN) can learn sophisticated spectral, spatial, and structural features to discriminate weed species, but training a CNN architecture for each dataset with limited training samples remains a challenge. In this study, we develop a partially transferable CNN to cope with a new dataset that has a different spatial resolution, a different number of bands, and a different signal-to-noise ratio, with the goal of making the training for each new dataset less demanding. We conducted a series of experiments on simulated image datasets from two sensors. The study reveals that the dropout layers placed between the convolutional layers have a significant impact on partial CNN transfer. Transferring the even-numbered subset of layers from the source CNN is more effective when the target dataset has a different spatial resolution. When the source and target datasets have different numbers of bands, all layers except the first convolutional layer are transferred; results show that network transfer is feasible when the numbers of bands of the two datasets are not very different. For variation in the signal-to-noise ratio, transfer learning performs acceptably as long as the noise level is not high. Based on these findings, experiments were conducted on two real datasets from two sensors that include all of these variations. Comparison with different state-of-the-art models shows that partial CNN transfer with even-numbered layers provides better mapping accuracy for the target dataset when only a limited number of training samples is available.
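The following is a minimal, illustrative sketch (in PyTorch, not the authors' released code) of the partial transfer idea described in the abstract: dropout between convolutional layers, copying an even-numbered subset of convolutional layers from a source CNN into a target CNN, and skipping the first convolutional layer when the source and target sensors provide different numbers of bands. Layer widths, kernel sizes, patch size, class count, and the helper names make_cnn and transfer_partial are illustrative assumptions, not details taken from the paper.

import torch
import torch.nn as nn


def make_cnn(in_bands: int, n_classes: int, widths=(32, 64, 64, 128)) -> nn.Sequential:
    """Small patch-based CNN with dropout placed between convolutional layers."""
    layers, prev = [], in_bands
    for w in widths:
        layers += [nn.Conv2d(prev, w, kernel_size=3, padding=1),
                   nn.ReLU(inplace=True),
                   nn.Dropout2d(p=0.2)]          # dropout between conv layers
        prev = w
    layers += [nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(prev, n_classes)]
    return nn.Sequential(*layers)


def transfer_partial(source: nn.Sequential, target: nn.Sequential,
                     even_only: bool = True, skip_first: bool = False) -> None:
    """Copy a subset of source conv weights into the target and freeze them."""
    src_convs = [m for m in source if isinstance(m, nn.Conv2d)]
    tgt_convs = [m for m in target if isinstance(m, nn.Conv2d)]
    for idx, (s, t) in enumerate(zip(src_convs, tgt_convs)):
        if skip_first and idx == 0:
            continue                              # band counts differ: retrain first conv layer
        if even_only and idx % 2 != 0:
            continue                              # keep only an even-indexed subset (stand-in
                                                  # for the paper's even-numbered layers)
        if s.weight.shape != t.weight.shape:
            continue                              # incompatible shapes stay trainable
        t.load_state_dict(s.state_dict())
        for p in t.parameters():
            p.requires_grad = False               # fine-tune only the remaining layers


if __name__ == "__main__":
    source_cnn = make_cnn(in_bands=4, n_classes=3)    # e.g. a 4-band source sensor
    target_cnn = make_cnn(in_bands=5, n_classes=3)    # e.g. a 5-band target sensor
    transfer_partial(source_cnn, target_cnn, even_only=True, skip_first=True)
    logits = target_cnn(torch.randn(2, 5, 9, 9))      # two 9x9 patches, 5 bands
    print(logits.shape)                               # torch.Size([2, 3])

After the transfer, only the unfrozen layers (here, the skipped first and odd-indexed convolutions plus the classifier) would be fine-tuned on the limited target samples; the exact subset choice and freezing policy are assumptions of this sketch.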
Journal Title
IEEE Transactions on Geoscience and Remote Sensing
Note
This publication has been entered in Griffith Research Online as an advanced online version.
Subject
Photogrammetry and remote sensing
Neural networks
Geomatic engineering
Citation
Farooq, A; Jia, X; Hu, J; Zhou, J, Transferable Convolutional Neural Network for Weed Mapping With Multisensor Imagery, IEEE Transactions on Geoscience and Remote Sensing, 2021