Show simple item record

dc.contributor.author: Cai, Chenghao
dc.contributor.author: Ke, Dengfeng
dc.contributor.author: Xu, Yanyan
dc.contributor.author: Su, Kaile
dc.contributor.editor: Amir Hussain
dc.date.accessioned: 2018-02-21T06:08:23Z
dc.date.available: 2018-02-21T06:08:23Z
dc.date.issued: 2015
dc.identifier.doi: 10.1109/IJCNN.2015.7280321
dc.identifier.uri: http://hdl.handle.net/10072/125374
dc.description.abstract: In this paper, we propose Multi-state Activation Functions (MSAFs) for Deep Neural Networks (DNNs). These multi-state functions extend the 2-state Logistic function to perform finer-grained classification. Analysis of the MSAFs shows that these activation functions can alter the parameter distribution of DNN models, improve model performance and reduce model size. An extension of the XOR problem illustrates how neural networks with the multi-state functions facilitate pattern classification. Furthermore, building on running-average mean-normalisation rules, we combine mean-normalised optimisation with the MSAFs as well as Singular Value Decomposition (SVD). Experimental results on TIMIT show that DNN-based acoustic models can be improved by applying the MSAFs: the models obtain better phone error rates when the Logistic function is replaced with the multi-state functions. Further experiments on large-vocabulary continuous speech recognition tasks show that the MSAFs and mean-normalised Stochastic Gradient Descent (MN-SGD) yield better recognition performance for DNNs than the conventional Logistic function with SGD. Beyond this, combining the MSAFs, the SVD method and MN-SGD shrinks the parameter scale of the DNNs to approximately 44%, leading to a considerable increase in decoding speed and decrease in model size without any loss of recognition performance. (An illustrative sketch of the MSAF and the SVD restructuring follows this record.)
dc.description.peerreviewed: Yes
dc.language: English
dc.publisher: Institute of Electrical and Electronics Engineers (IEEE)
dc.publisher.place: United States
dc.relation.ispartofconferencename: IJCNN 2015
dc.relation.ispartofconferencetitle: 2015 International Joint Conference on Neural Networks (IJCNN)
dc.relation.ispartofdatefrom: 2015-07-12
dc.relation.ispartofdateto: 2015-07-17
dc.relation.ispartoflocation: Killarney, Ireland
dc.subject.fieldofresearch: Distributed computing and systems software not elsewhere classified
dc.subject.fieldofresearchcode: 460699
dc.title: A Combination of Multi-state Activation Functions, Mean-normalisation and Singular Value Decomposition for learning Deep Neural Networks
dc.type: Conference output
dc.type.description: E1 - Conferences
dc.type.code: E - Conference Publications
gro.hasfulltext: No Full Text
gro.griffith.author: Su, Kaile
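
Illustrative sketch. The abstract above leaves the exact form of the MSAFs and of the SVD restructuring to the paper itself. The Python sketch below is a rough illustration only, assuming the multi-state activation is a sum of shifted 2-state logistic units (the offsets shown are invented, not the paper's values) and that the SVD step follows the standard low-rank factorisation of a layer's weight matrix into two smaller matrices; the names msaf and svd_shrink_layer are hypothetical and do not come from the paper.

import numpy as np

def logistic(x):
    """Standard 2-state logistic (sigmoid) activation."""
    return 1.0 / (1.0 + np.exp(-x))

def msaf(x, offsets=(0.0, 4.0, 8.0)):
    """Hypothetical multi-state activation: a sum of shifted logistic units.

    With N offsets the output saturates at N distinct plateaus (plus zero),
    so a single neuron can separate more than two regions of its input.
    The offsets here are illustrative, not values from the paper.
    """
    return sum(logistic(x - b) for b in offsets)

def svd_shrink_layer(W, k):
    """Replace an m x n weight matrix W by two low-rank factors.

    W is approximated as A @ B with A of shape (m, k) and B of shape (k, n),
    so the parameter count drops from m*n to k*(m + n). The rank k trades
    model size against accuracy.
    """
    U, s, Vt = np.linalg.svd(W, full_matrices=False)
    A = U[:, :k] * s[:k]   # (m, k), singular values folded into the left factor
    B = Vt[:k, :]          # (k, n)
    return A, B

if __name__ == "__main__":
    # Toy check: a 1024 x 2048 layer factorised at rank 256 keeps
    # 256 * (1024 + 2048) / (1024 * 2048) = 37.5% of the original parameters.
    rng = np.random.default_rng(0)
    W = rng.standard_normal((1024, 2048))
    A, B = svd_shrink_layer(W, 256)
    print(f"kept {(A.size + B.size) / W.size:.1%} of parameters")
    print("msaf(5.0) =", msaf(np.array(5.0)))

In this toy setting the rank-256 factorisation keeps 37.5% of the layer's parameters; the approximately 44% overall figure quoted in the abstract depends on the ranks the authors actually chose per layer, which are not reproduced here.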


Files in this item


There are no files associated with this item.

This item appears in the following Collection(s)

  • Conference outputs
    Contains papers delivered by Griffith authors at national and international conferences.
