Dropout with Tabu Strategy for Regularizing Deep Neural Networks

File version

Accepted Manuscript (AM)

Author(s)
Ma, Zongjie
Sattar, Abdul
Zhou, Jun
Chen, Qingliang
Su, Kaile
Date
2020
Abstract

Dropout has been proven to be an effective technique for regularizing deep neural networks (DNNs) and preventing the co-adaptation of their neurons. During the training stage, it randomly drops units with probability p to avoid overfitting. The working mechanism of dropout can be interpreted as efficiently combining exponentially many different neural network architectures in an approximate way, yielding a powerful ensemble. In this work, we propose a novel diversification strategy for dropout, which aims at generating more distinct neural network architectures in fewer iterations. The units dropped in the previous forward propagation are marked; if a unit selected for dropping in the current forward propagation was marked in the previous one, it is retained instead. That is, only the units dropped in the most recent forward propagation carry marks. We call this new regularization scheme Tabu dropout. Its significance lies in that it requires no extra parameters compared with the standard dropout strategy and is computationally efficient as well. Experiments conducted on four public datasets show that Tabu dropout improves the performance of standard dropout, yielding better generalization capability.
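To make the marking rule above concrete, here is a minimal sketch in NumPy. It is an illustration under stated assumptions, not the paper's implementation: the class name TabuDropout, the per-element tabu mask, and the inverted-dropout scaling by 1/(1 - p) are choices made here for readability.

```python
import numpy as np

class TabuDropout:
    """Minimal sketch of the Tabu dropout rule described in the abstract.

    Units dropped in the previous forward pass are marked as "tabu"; any
    unit sampled for dropping in the current pass that is tabu is retained
    instead, and only the units actually dropped now are marked for the
    next pass.
    """

    def __init__(self, p=0.5):
        self.p = p          # drop probability, as in standard dropout
        self.tabu = None    # mask of units dropped in the previous pass

    def forward(self, x, training=True):
        if not training:
            return x        # with inverted scaling, inference is identity
        # Sample drop candidates exactly as standard dropout would.
        candidates = np.random.rand(*x.shape) < self.p
        if self.tabu is None or self.tabu.shape != candidates.shape:
            self.tabu = np.zeros_like(candidates)
        # Tabu rule: a candidate dropped last time is retained this time.
        dropped = candidates & ~self.tabu
        # Mark only the units dropped in this pass (overwrites old marks).
        self.tabu = dropped
        # Inverted-dropout scaling by 1/(1 - p) is an assumption here; the
        # effective drop rate is slightly below p, since tabu candidates
        # are retained.
        return x * ~dropped / (1.0 - self.p)

# Illustrative usage: under this rule, no unit can be dropped in two
# consecutive passes, pushing successive iterations toward more distinct
# sub-network architectures.
layer = TabuDropout(p=0.5)
h = np.random.randn(32, 128)   # a batch of activations
out1 = layer.forward(h)        # first pass: behaves like standard dropout
out2 = layer.forward(h)        # second pass: last pass's drops are tabu
```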

Journal Title

The Computer Journal

Volume

63

Issue

7

Rights Statement

© 2020 Oxford University Press. This is a pre-copy-editing, author-produced PDF of an article accepted for publication in The Computer Journal following peer review. The definitive publisher-authenticated version Dropout with Tabu Strategy for Regularizing Deep Neural Networks, The Computer Journal, 2020, 63 (7), pp. 1031-1038 is available online at: https://doi.org/10.1093/comjnl/bxz062.

Subject

Information and computing sciences

Science & Technology

Technology

Computer Science, Hardware & Architecture

Computer Science, Information Systems

Computer Science, Software Engineering

Citation

Ma, Z; Sattar, A; Zhou, J; Chen, Q; Su, K, Dropout with Tabu Strategy for Regularizing Deep Neural Networks, The Computer Journal, 2020, 63 (7), pp. 1031-1038
