Thrifty Neural Architecture Search for Medical Image Segmentation (Student Abstract)
Author(s)
Chen, R
Zhang, M
Zheng, X
Pan, S
Date
2022
Location
Virtual
Abstract
Convolutional neural network (CNN) based image segmentation has been widely used in analyzing medical images and has benefited many real-world disease-diagnosis applications. However, existing advanced CNN-based medical image segmentation models usually contain numerous parameters that require massive computation and memory, limiting their applicability in data-constrained or hardware-constrained environments. Leveraging the recently proposed neural architecture search (NAS), this paper presents a novel approach, dubbed Thrifty NAS, to automatically design computation- and memory-efficient models for medical image segmentation. Models found by Thrifty NAS have far fewer parameters while retaining competitive performance. More specifically, we design a micro-level space for cell structure search and a macro-level cell path for better network structure modeling. Extensive experimental results on different medical image datasets verify the effectiveness of the proposed method, which achieves competitive segmentation performance with a minuscule model size of 0.61M parameters, compared with U-Net (7.76M) and UNet++ (9.04M).
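The abstract does not describe how the micro-level cell search is implemented. For orientation only, the sketch below shows a common differentiable-NAS building block, a DARTS-style mixed operation, in PyTorch; the candidate operation set, class name, and softmax relaxation are illustrative assumptions, not the actual Thrifty NAS design.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class MixedOp(nn.Module):
    """One edge of a searchable cell: a softmax-weighted sum over
    candidate operations (DARTS-style continuous relaxation).
    The three candidate ops below are illustrative placeholders,
    not the operation set used by Thrifty NAS."""

    def __init__(self, channels: int):
        super().__init__()
        self.ops = nn.ModuleList([
            # 3x3 convolution
            nn.Conv2d(channels, channels, 3, padding=1, bias=False),
            # 3x3 dilated convolution (padding=2 preserves spatial size)
            nn.Conv2d(channels, channels, 3, padding=2, dilation=2, bias=False),
            # parameter-free skip connection
            nn.Identity(),
        ])
        # One learnable architecture weight per candidate operation.
        self.alpha = nn.Parameter(torch.zeros(len(self.ops)))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        weights = F.softmax(self.alpha, dim=0)
        return sum(w * op(x) for w, op in zip(weights, self.ops))

if __name__ == "__main__":
    edge = MixedOp(channels=16)
    x = torch.randn(1, 16, 64, 64)  # (batch, channels, height, width)
    print(edge(x).shape)            # torch.Size([1, 16, 64, 64])
```

After search, only the highest-weighted operation per edge is typically retained, which is one way NAS-derived models end up far smaller than hand-designed networks such as U-Net.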
Conference Title
Proceedings of the AAAI Conference on Artificial Intelligence
Volume
36
Issue
11
Subject
Computational imaging
Artificial intelligence
Neural networks
Citation
Chen, R; Zhang, M; Zheng, X; Pan, S, Thrifty Neural Architecture Search for Medical Image Segmentation (Student Abstract), Proceedings of the AAAI Conference on Artificial Intelligence, 2022, 36 (11), pp. 12925-12926