A Combination of Multi-state Activation Functions, Mean-normalisation and Singular Value Decomposition for learning Deep Neural Networks

    Author(s)
    Cai, Chenghao
    Ke, Dengfeng
    Xu, Yanyan
    Su, Kaile
    Griffith University Author(s)
    Su, Kaile
    Year published
    2015
Abstract
In this paper, we propose Multi-state Activation Functions (MSAFs) for Deep Neural Networks (DNNs). These multi-state functions perform extra classification based on the 2-state Logistic function. Analysis of the MSAFs reveals that these activation functions have the potential to alter the parameter distribution of DNN models, improve model performance and reduce model sizes. Meanwhile, an extension of the XOR problem indicates how neural networks with the multi-state functions facilitate classifying patterns. Furthermore, based on running-average mean-normalisation rules, we implement a combination of mean-normalised optimisation with the MSAFs as well as Singular Value Decomposition (SVD). Experimental results on TIMIT reveal that acoustic models based on DNNs can be improved by applying the MSAFs: the models obtain better phone error rates when the Logistic function is replaced with the multi-state functions. Further experiments on large-vocabulary continuous speech recognition tasks reveal that the MSAFs and mean-normalised Stochastic Gradient Descent (MN-SGD) yield better recognition performance for DNNs than the conventional Logistic function and SGD learning method. Beyond this, the combination of the MSAFs, the SVD method and MN-SGD shrinks the parameter scale of DNNs to approximately 44%, leading to a considerable increase in decoding speed and decrease in model size without any loss of recognition performance.
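The abstract describes MSAFs as multi-state extensions of the 2-state Logistic function. The paper's exact formulation is not given in this record, but a common way to realise a k-state activation is as a sum of shifted logistic functions; the offsets below are purely illustrative assumptions:

```python
import numpy as np

def logistic(x):
    """The conventional 2-state Logistic activation."""
    return 1.0 / (1.0 + np.exp(-x))

def msaf(x, shifts=(0.0, 4.0, 8.0)):
    """A sketch of a multi-state activation: summing shifted logistics
    produces a staircase-like function with len(shifts)+1 plateaus,
    giving each unit extra "states" beyond on/off.
    The shift values here are illustrative, not the paper's."""
    return sum(logistic(x - b) for b in shifts)
```

With three shifts the function saturates near 3 for large inputs and near 0 for very negative ones, passing through intermediate plateaus in between, which is the kind of extra state structure the abstract attributes to MSAFs.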
    Conference Title
    2015 International Joint Conference on Neural Networks (IJCNN)
    DOI
    https://doi.org/10.1109/IJCNN.2015.7280321
    Subject
    Distributed computing and systems software not elsewhere classified
    Publication URI
    http://hdl.handle.net/10072/125374
    Collection
    • Conference outputs
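The abstract reports that combining SVD with the MSAFs and MN-SGD shrinks DNN parameter scales to approximately 44%. The standard SVD compression trick replaces a weight matrix with two low-rank factors; the matrix size and rank below are hypothetical choices picked so the factors hold roughly 44% of the original parameters, matching the figure quoted above:

```python
import numpy as np

rng = np.random.default_rng(0)
W = rng.standard_normal((1024, 1024))  # a hypothetical DNN weight matrix

# Truncated SVD: W ~ A @ B, where A = U_k * s_k and B = Vt_k.
k = 224  # illustrative rank; 2 * 1024 * 224 params = 43.75% of 1024^2
U, s, Vt = np.linalg.svd(W, full_matrices=False)
A = U[:, :k] * s[:k]   # shape (1024, k)
B = Vt[:k, :]          # shape (k, 1024)

ratio = (A.size + B.size) / W.size
print(f"parameter ratio: {ratio:.4f}")  # prints 0.4375
```

In practice the rank is chosen per layer from the decay of the singular values, and the factored network is usually retrained briefly to recover any lost accuracy.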
