TY - JOUR
T1 - Extracting chemical-protein relations using attention-based neural networks
AU - Liu, Sijia
AU - Shen, Feichen
AU - Komandur Elayavilli, Ravikumar
AU - Wang, Yanshan
AU - Rastegar-Mojarad, Majid
AU - Chaudhary, Vipin
AU - Liu, Hongfang
N1 - Funding Information:
US National Institutes of Health (R01 LM011829); National Science Foundation IPA grant.
Publisher Copyright:
© The Author(s) 2018. Published by Oxford University Press.
PY - 2018/1/1
Y1 - 2018/1/1
AB - Relation extraction is an important task in natural language processing. In this paper, we describe our approach to BioCreative VI Task 5: text mining chemical-protein interactions. We investigate multiple deep neural network (DNN) models, including convolutional neural networks, recurrent neural networks (RNNs) and attention-based RNNs (ATT-RNNs), to extract chemical-protein relations. Our experimental results indicate that the ATT-RNN models outperform their counterparts without attention, and that the attention-based gated recurrent unit (ATT-GRU) model achieves the best micro-averaged F1 score of 0.527 on the test set among the tested DNNs. In addition, inspection of the word-level attention weights shows that the attention mechanism is effective in selecting the most important trigger words when trained with semantic relation labels, without the need for semantic parsing or feature engineering. The source code of this work is available at https://github.com/ohnlp/att-chemprot.
UR - http://www.scopus.com/inward/record.url?scp=85054456100&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85054456100&partnerID=8YFLogxK
DO - 10.1093/database/bay102
M3 - Article
C2 - 30295724
AN - SCOPUS:85054456100
SN - 1758-0463
VL - 2018
JO - Database
JF - Database
ER -