Reinforced self-attention network: A hybrid of hard and soft attention for sequence modeling

File version

Version of Record (VoR)

Author(s)
Shen, T
Zhou, T
Long, G
Jiang, J
Wang, S
Zhang, C
Date
2018
Abstract

Many natural language processing tasks rely solely on sparse dependencies between a few tokens in a sentence. Soft attention mechanisms model local/global dependencies well by assigning soft probabilities to every pair of tokens, but they become neither effective nor efficient when applied to long sentences. By contrast, hard attention mechanisms directly select a subset of tokens but are difficult and inefficient to train due to their combinatorial nature. In this paper, we integrate both soft and hard attention into one context fusion model, “reinforced self-attention (ReSA)”, so that each benefits the other. In ReSA, a hard attention mechanism trims a sequence for a soft self-attention to process, while the soft attention feeds reward signals back to facilitate the training of the hard one. For this purpose, we develop a novel hard attention mechanism, “reinforced sequence sampling (RSS)”, which selects tokens in parallel and is trained via policy gradient. Using two RSS modules, ReSA efficiently extracts the sparse dependencies between each pair of selected tokens. Finally, we propose an RNN/CNN-free sentence-encoding model, “reinforced self-attention network (ReSAN)”, based solely on ReSA. It achieves state-of-the-art performance on both the Stanford Natural Language Inference (SNLI) and Sentences Involving Compositional Knowledge (SICK) datasets.
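
As a rough illustration of the idea sketched in the abstract, the following PyTorch snippet shows how a parallel Bernoulli hard-attention module could trim the token set before a masked soft self-attention, keeping the sampled log-probabilities for a REINFORCE-style policy-gradient loss. This is a minimal sketch under assumed names (RSS, ReSASketch, d_model), not the authors' released implementation; the soft attention here is a plain scaled dot-product self-attention rather than the paper's exact ReSA formulation.

# Minimal, illustrative sketch only; all names are hypothetical.
import torch
import torch.nn as nn
import torch.nn.functional as F

class RSS(nn.Module):
    """Hard attention as parallel per-token Bernoulli sampling.

    Every token gets an independent keep/drop decision, so selection runs in
    parallel rather than sequentially; the sampled log-probabilities are
    returned for a REINFORCE-style policy-gradient loss.
    """
    def __init__(self, d_model: int):
        super().__init__()
        self.scorer = nn.Linear(d_model, 1)

    def forward(self, x):  # x: (batch, seq, d_model)
        probs = torch.sigmoid(self.scorer(x)).squeeze(-1)   # (batch, seq)
        mask = torch.bernoulli(probs)                       # hard 0/1 selection
        log_p = mask * torch.log(probs + 1e-8) \
              + (1.0 - mask) * torch.log(1.0 - probs + 1e-8)
        return mask, log_p.sum(dim=-1)                      # mask, log-prob per sequence

class ReSASketch(nn.Module):
    """Soft self-attention restricted to token pairs kept by two RSS modules."""
    def __init__(self, d_model: int):
        super().__init__()
        self.rss_q = RSS(d_model)   # selects which tokens attend (queries)
        self.rss_k = RSS(d_model)   # selects which tokens are attended to (keys)

    def forward(self, x):  # x: (batch, seq, d_model)
        q_mask, q_logp = self.rss_q(x)
        k_mask, k_logp = self.rss_k(x)
        d = x.size(-1)
        scores = x @ x.transpose(1, 2) / d ** 0.5           # (batch, seq, seq)
        # Hide dropped keys; -1e9 (rather than -inf) keeps the softmax finite
        # even if RSS happens to drop every key in a sequence.
        scores = scores.masked_fill(k_mask.unsqueeze(1) == 0, -1e9)
        ctx = F.softmax(scores, dim=-1) @ x                 # soft attention output
        # Dropped queries pass through unchanged.
        out = q_mask.unsqueeze(-1) * ctx + (1.0 - q_mask).unsqueeze(-1) * x
        return out, q_logp + k_logp

A toy usage, where the random reward stands in for the task-derived signal that, per the abstract, the soft attention feeds back to train the hard one (e.g., derived from the SNLI classification loss):

x = torch.randn(2, 12, 64)                       # toy batch of 12-token sequences
model = ReSASketch(64)
out, logp = model(x)
reward = torch.randn(2)                          # placeholder for a task-derived reward
policy_loss = -(reward.detach() * logp).mean()   # REINFORCE objective for RSS
policy_loss.backward()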

Conference Title

IJCAI International Joint Conference on Artificial Intelligence

Volume

2018-July

Rights Statement

© 2018 International Joint Conference on Artificial Intelligence. The attached file is reproduced here in accordance with the copyright policy of the publisher. Please refer to the Conference's website for access to the definitive, published version.

Subject

Information and computing sciences
