Transformers in RNA structure prediction: A review
File version
Version of Record (VoR)
Author(s)
Rashid, Mahmood A
Paliwal, Kuldip K
Abstract
The Transformer is a deep neural network based on the self-attention mechanism, designed to handle sequential data. Given its tremendous success in natural language processing, it has gained traction in other domains. As the primary structure of RNA is a sequence of nucleotides, researchers have applied Transformers to predict secondary and tertiary structures from RNA sequences. The number of Transformer-based models for structure prediction tasks is rapidly increasing, as they have performed on par with or better than other deep learning networks, such as Convolutional and Recurrent Neural Networks. This article thoroughly examines Transformer-based RNA structure prediction models. Through an in-depth analysis of these models, we aim to explain how their architectural innovations improve their performance and where they still fall short. As Transformer-based techniques for RNA structure prediction continue to evolve, this review serves both as a record of past achievements and as a guide to future research directions.
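The scaled dot-product self-attention the abstract refers to can be sketched in a few lines. The following is an illustrative NumPy example only, not code from any reviewed model: the toy RNA sequence, the one-hot encoding, and the random projection weights are all invented for the demonstration.

```python
# Illustrative sketch: single-head scaled dot-product self-attention
# applied to a one-hot-encoded RNA sequence (assumptions: toy sequence,
# random projection weights; not from any reviewed model).
import numpy as np

def self_attention(x, wq, wk, wv):
    """x has shape (seq_len, d_model); returns one contextual vector per position."""
    q, k, v = x @ wq, x @ wk, x @ wv            # project to queries, keys, values
    scores = q @ k.T / np.sqrt(k.shape[-1])      # pairwise similarity, scaled
    scores -= scores.max(axis=-1, keepdims=True)  # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # row-wise softmax
    return weights @ v                           # mix values by attention weights

# One-hot encode a toy RNA sequence (A, C, G, U -> 4-dim vectors).
seq = "GCAU"
alphabet = "ACGU"
x = np.eye(4)[[alphabet.index(b) for b in seq]]

rng = np.random.default_rng(0)
d = 4
wq, wk, wv = (rng.standard_normal((d, d)) for _ in range(3))
out = self_attention(x, wq, wk, wv)
print(out.shape)  # one 4-dim contextual vector per nucleotide
```

Because every position attends to every other, the output for each nucleotide depends on the whole sequence, which is what makes the mechanism attractive for modeling long-range base-pairing interactions in RNA.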
Journal Title
Computational and Structural Biotechnology Journal
Volume
27
Rights Statement
© 2025 The Authors. Published by Elsevier B.V. on behalf of Research Network of Computational and Structural Biotechnology. This is an open access article under the CC BY license (http://creativecommons.org/licenses/by/4.0/).
Citation
Chaturvedi, M; Rashid, MA; Paliwal, KK, Transformers in RNA structure prediction: A review, Computational and Structural Biotechnology Journal, 2025, 27, pp. 1187-1203