Furrow Mapping of Sugarcane Billet Density Using Deep Learning and Object Detection
File version
Accepted Manuscript (AM)
Author(s)
Scott, J; Busch, A
Date
2020
Location
Melbourne, Australia
Abstract
Australia's sugar industry is currently facing significant hardship due to global market contractions caused by COVID-19, increased crop forecasts from larger global producers, and falling oil prices. Current planting practices rely on inefficient mass-flow planting techniques, and to date no attempt has been made to map the seed using machine vision in order to understand the underlying problems. This paper investigates the feasibility of creating a labelled sugarcane billet dataset using a readily available camera positioned beneath a planter, and of analysing these images with a YOLOv3 network. The trained network achieved a high mean average precision of 0.852 at an intersection-over-union threshold of 0.5 (mAP50) on test images, and was used to provide planting metrics by generating a furrow map.
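As context for the mAP50 figure quoted above: a detection is counted as correct when its predicted bounding box overlaps a ground-truth billet box with an intersection over union (IoU) of at least 0.5. The Python sketch below illustrates that criterion and, purely as an assumption, one simple way per-frame detection counts might be aggregated into a furrow density map; it is not the authors' implementation, and all names and the frame_length_m value are illustrative only.

import numpy as np

def iou(box_a, box_b):
    """Intersection over union of two axis-aligned boxes given as (x1, y1, x2, y2)."""
    ix1, iy1 = max(box_a[0], box_b[0]), max(box_a[1], box_b[1])
    ix2, iy2 = min(box_a[2], box_b[2]), min(box_a[3], box_b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0

def is_true_positive(pred_box, gt_boxes, threshold=0.5):
    """A prediction counts at the mAP50 threshold if it overlaps a ground-truth box
    with IoU >= 0.5 (one-to-one greedy matching omitted for brevity)."""
    return any(iou(pred_box, gt) >= threshold for gt in gt_boxes)

def furrow_density_map(billet_counts_per_frame, frame_length_m=0.5):
    """Convert per-frame billet detection counts into billets per metre of furrow.
    frame_length_m (ground distance covered per image) is an assumed value, not from the paper."""
    counts = np.asarray(billet_counts_per_frame, dtype=float)
    return counts / frame_length_m

if __name__ == "__main__":
    gt = [(10, 10, 60, 60)]
    print(is_true_positive((12, 11, 58, 62), gt))    # True: IoU is roughly 0.87
    print(is_true_positive((50, 50, 100, 100), gt))  # False: IoU is roughly 0.02
    print(furrow_density_map([3, 4, 2, 5]))          # [6. 8. 4. 10.] billets per metre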
Conference Title
2020 Digital Image Computing: Techniques and Applications, DICTA 2020
Funder(s)
ARC
Grant identifier(s)
IH180100002
Rights Statement
© 2020 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works.
Subject
Computer vision
Agriculture, land and farm management
Citation
Scott, J; Busch, A, Furrow Mapping of Sugarcane Billet Density Using Deep Learning and Object Detection, 2020 Digital Image Computing: Techniques and Applications, DICTA 2020, 2020