A Novel Surrogate Model for Variable-Length Encoding and its Application in Optimising Deep Learning Architecture

Author(s)
Dang, T
Nguyen, TT
McCall, J
Han, K
Liew, AWC
Date
2024
Location

Yokohama, Japan

Abstract

Deep neural networks (DNNs) have achieved great success across multiple domains. In recent years, a number of approaches have emerged for automatically finding optimal DNN configurations. One technique among these approaches that shows great promise is Evolutionary Algorithms (EAs), which are inspired by natural, biological processes. However, an EA needs to evaluate many DNN candidates, so when the training time for a single DNN is large, the total required time becomes very large. A potential solution is the Surrogate-Assisted Evolutionary Algorithm (SAEA), in which a surrogate model is used to predict the performance of DNNs without training them. Notably, all popular surrogate models in the literature require a fixed-length input, while encodings of a DNN are usually variable-length, since a DNN structure is very complex and its depth, layer sizes, etc. cannot be known beforehand. In this paper, we propose a novel surrogate model for variable-length encodings to optimise deep learning architectures. An encoder-decoder model converts the variable-length encoding into a fixed-length representation, which is used as input to the surrogate model to predict DNN performance without training. The weights of the encoder-decoder model are found by training on the variable-length data, with the targets being the same as the inputs, while the surrogate model is trained on the encoder output of the encoder-decoder model. In this study, a Long Short-Term Memory (LSTM) model is used as both the encoder and the decoder. Our proposed variable-length-encoding-based surrogate model is tested on a well-known method that evolves optimal Convolutional Neural Networks (CNNs). The experimental results show that our proposed method achieves competitive performance while significantly reducing the time of the optimisation process.
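The abstract describes an LSTM encoder-decoder trained as an autoencoder (targets equal to inputs), with the encoder's final hidden state serving as a fixed-length input to a performance surrogate. Below is a minimal sketch of that pipeline in PyTorch; the class names, the integer-token encoding of architectures, and all dimensions are illustrative assumptions, not the authors' implementation.

```python
# Sketch only: an LSTM autoencoder mapping variable-length architecture
# encodings to a fixed-length vector, plus a surrogate regressor on that
# vector. Names and the toy integer encoding are hypothetical.
import torch
import torch.nn as nn
from torch.nn.utils.rnn import pack_padded_sequence

class ArchAutoencoder(nn.Module):
    """LSTM encoder-decoder: variable-length encoding -> fixed-length vector."""
    def __init__(self, vocab_size: int, embed_dim: int = 32, hidden_dim: int = 64):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.encoder = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.decoder = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.out = nn.Linear(hidden_dim, vocab_size)  # reconstruct input tokens

    def encode(self, tokens, lengths):
        x = self.embed(tokens)
        packed = pack_padded_sequence(x, lengths, batch_first=True,
                                      enforce_sorted=False)
        _, (h, c) = self.encoder(packed)
        return h[-1], (h, c)  # h[-1] is the fixed-length representation

    def forward(self, tokens, lengths):
        z, state = self.encode(tokens, lengths)
        # Teacher-forced decoding: target sequence equals the input sequence.
        dec_out, _ = self.decoder(self.embed(tokens), state)
        return self.out(dec_out), z

class SurrogateMLP(nn.Module):
    """Predicts DNN performance from the fixed-length encoder output."""
    def __init__(self, hidden_dim: int = 64):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(hidden_dim, 64), nn.ReLU(),
                                 nn.Linear(64, 1))

    def forward(self, z):
        return self.net(z).squeeze(-1)

# Usage sketch: autoencoder reconstruction loss, then surrogate prediction.
tokens = torch.randint(1, 10, (4, 7))   # 4 encoded architectures, padded to length 7
lengths = torch.tensor([7, 5, 6, 3])
ae = ArchAutoencoder(vocab_size=10)
logits, z = ae(tokens, lengths)
recon_loss = nn.functional.cross_entropy(logits.transpose(1, 2), tokens)
fitness_pred = SurrogateMLP()(z)         # stands in for expensive DNN training
```

Once the autoencoder converges, the surrogate is fit to map the fixed-length vectors to measured fitness (e.g., validation accuracy), so the EA can rank candidate architectures without fully training each one; in practice the reconstruction loss would also mask padded positions.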

Conference Title

2024 IEEE Congress on Evolutionary Computation (CEC)

Citation

Dang, T; Nguyen, TT; McCall, J; Han, K; Liew, AWC, A Novel Surrogate Model for Variable-Length Encoding and its Application in Optimising Deep Learning Architecture, 2024 IEEE Congress on Evolutionary Computation (CEC), 2024