Text Mining Drug/Chemical-Protein Interactions using an Ensemble of BERT and T5 Based Models (2111.15617v1)

Published 30 Nov 2021 in cs.CL

Abstract: In Track-1 of the BioCreative VII Challenge, participants are asked to identify interactions between drugs/chemicals and proteins. In-context named-entity annotations for each drug/chemical and protein are provided, and one of fourteen interaction types must be predicted automatically. For this relation extraction task, we attempt both a BERT-based sentence-classification approach and a more novel text-to-text approach using a T5 model. We find that larger BERT-based models perform better in general, with our BioMegatron-based model achieving the highest scores across all metrics (0.74 F1). Although our T5 text-to-text method did not perform as well as most of our BERT-based models, it outperformed those trained on similar data, showing promising results with a 0.65 F1 score. We believe a text-to-text approach to relation extraction has some competitive advantages and that there is substantial room for further research.
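The abstract mentions that each example comes with in-context entity annotations and that the T5 approach casts relation extraction as text-to-text. A minimal sketch of how such a framing might look is below; the marker tags, task prefix, and function name are illustrative assumptions, not the authors' actual preprocessing.

```python
def build_t5_example(sentence, chem_span, prot_span, relation=None):
    """Format one drug/chemical-protein pair as a text-to-text example.

    chem_span and prot_span are (start, end) character offsets into
    `sentence`; `relation` is the gold label when building training data
    and None at inference time. Tag names and the "chemprot relation:"
    prefix are hypothetical choices for illustration.
    """
    # Insert inline markers around the two entities. Assumes the spans do
    # not overlap; the later span is processed first so that earlier
    # character offsets remain valid after insertion.
    spans = sorted(
        [(chem_span, "chem"), (prot_span, "prot")],
        key=lambda item: item[0][0],
        reverse=True,
    )
    marked = sentence
    for (start, end), tag in spans:
        marked = f"{marked[:start]}<{tag}> {marked[start:end]} </{tag}>{marked[end:]}"

    source = f"chemprot relation: {marked}"
    target = relation  # e.g. "INHIBITOR" (one of the fourteen interaction types)
    return source, target


src, tgt = build_t5_example("Aspirin inhibits COX-2.", (0, 7), (17, 22), "INHIBITOR")
# src: "chemprot relation: <chem> Aspirin </chem> inhibits <prot> COX-2 </prot>."
# tgt: "INHIBITOR"
```

At training time the `source` string is the encoder input and the model is taught to generate the `target` label as free text; at inference the generated string is matched back to the label set. This is one plausible way to realize the text-to-text setup the abstract describes.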

Authors (5)
  1. Virginia Adams (8 papers)
  2. Hoo-Chang Shin (17 papers)
  3. Carol Anderson (5 papers)
  4. Bo Liu (484 papers)
  5. Anas Abidin (4 papers)
Citations (3)
