Dropout with Tabu Strategy for Regularizing Deep Neural Networks

File
    Ma456654-Accepted.pdf (335.5Kb)
    File version
    Accepted Manuscript (AM)
    Author(s)
    Ma, Zongjie
    Sattar, Abdul
    Zhou, Jun
    Chen, Qingliang
    Su, Kaile
    Griffith University Author(s)
    Zhou, Jun
    Su, Kaile
    Sattar, Abdul
    Ma, Zongjie
    Chen, Qingliang
    Year published
    2020
    Abstract
    Dropout has proven to be an effective technique for regularizing deep neural networks (DNNs) and preventing the co-adaptation of neurons. It randomly drops units with probability p during the training stage of a DNN to avoid overfitting. The working mechanism of dropout can be interpreted as approximately and efficiently combining exponentially many different neural network architectures, leading to a powerful ensemble. In this work, we propose a novel diversification strategy for dropout, which aims to generate more distinct neural network architectures in fewer iterations. The units dropped in the last forward propagation are marked; units selected for dropping in the current forward propagation are then retained if they were marked in the last forward propagation, i.e., we only mark the units from the last forward propagation. We call this new regularization scheme Tabu dropout. Its significance lies in the fact that it has no extra parameters compared with the standard dropout strategy and is computationally efficient as well. Experiments conducted on four public datasets show that Tabu dropout improves on the performance of standard dropout, yielding better generalization capability.
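    The marking rule described in the abstract can be sketched as follows. This is an illustrative reconstruction from the abstract alone, not the authors' released code: a unit dropped in the previous forward pass becomes "tabu" and is retained if standard dropout selects it again, so consecutive passes sample more diverse sub-networks. The inverted-dropout scaling by 1/(1-p) is assumed here, as in standard dropout.

    ```python
    import numpy as np

    def tabu_dropout(x, prev_dropped, p=0.5, training=True):
        """One forward pass of Tabu dropout (sketch based on the abstract).

        x            : activations, any shape
        prev_dropped : boolean mask of units dropped in the previous pass
        Returns the masked activations and the new dropped-unit mask,
        which should be fed back in as prev_dropped on the next pass.
        """
        if not training:
            # At test time dropout is disabled; nothing is marked.
            return x, np.zeros_like(x, dtype=bool)
        # Units that standard dropout would drop with probability p.
        candidate = np.random.rand(*x.shape) < p
        # Tabu rule: a unit dropped last pass is retained this pass.
        dropped = candidate & ~prev_dropped
        mask = ~dropped
        # Inverted-dropout scaling (assumed), keeping expected
        # activations roughly unchanged as in standard dropout.
        return x * mask / (1.0 - p), dropped
    ```

    In a training loop the returned mask is carried between iterations, which guarantees that no unit is dropped in two consecutive forward propagations.
    
    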
    Journal Title
    The Computer Journal
    Volume
    63
    Issue
    7
    DOI
    https://doi.org/10.1093/comjnl/bxz062
    Copyright Statement
    © 2020 Oxford University Press. This is a pre-copy-editing, author-produced PDF of an article accepted for publication in The Computer Journal following peer review. The definitive publisher-authenticated version Dropout with Tabu Strategy for Regularizing Deep Neural Networks, The Computer Journal, 2020, 63 (7), pp. 1031-1038 is available online at: https://doi.org/10.1093/comjnl/bxz062.
    Subject
    Information and computing sciences
    Science & Technology
    Technology
    Computer Science, Hardware & Architecture
    Computer Science, Information Systems
    Computer Science, Software Engineering
    Publication URI
    http://hdl.handle.net/10072/400664
    Collection
    • Journal articles
