Network-based Topic Interaction Map for Big Data Mining of COVID-19 Biomedical Literature (2106.07374v4)

Published 7 Jun 2021 in cs.IR and stat.AP

Abstract: Since the emergence of the worldwide COVID-19 pandemic, relevant research has been published at a dazzling pace, yielding an abundance of big data in the biomedical literature. Because of the high volume of relevant literature, it is practically impossible to follow the research manually. Topic modeling is a well-known unsupervised learning method that aims to reveal latent topics from text data. In this paper, we propose a novel analytical framework for estimating topic interactions, together with an effective visualization of the relationships among topics. We first estimate topic-word distributions using the biterm topic model and then estimate the interactions among topics from these word distributions using the latent space item response model. We map the latent topics onto a network to visualize the relationships among them. Moreover, within the proposed approach, we develop a score that helps select meaningful words characterizing each topic. We investigate how topics are related by examining how their relationships change, using a "trajectory plot" constructed at different levels of word richness. These findings provide a thoroughly mined and intuitive representation of the relationships between topics in a specific research area. The application of the proposed framework to the PubMed literature demonstrates the utility of our approach in understanding the topic composition of COVID-19 studies at the stage of the pandemic's emergence.
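The pipeline described in the abstract (topic-word estimation, topic-interaction estimation, network visualization) can be sketched with off-the-shelf tools. The sketch below is illustrative only: it swaps in scikit-learn's LDA as a stand-in for the biterm topic model, uses cosine similarity of topic-word distributions as a crude proxy for the latent space item response model, and replaces the paper's word-selection score with a simple top-words labeling. The toy corpus, topic count, and threshold are all assumptions, not the authors' settings.

```python
# Illustrative sketch of the framework's three steps, with stand-in methods
# (LDA instead of the biterm topic model, cosine similarity instead of the
# latent space item response model). All data and parameters are toy values.
import networkx as nx
import matplotlib.pyplot as plt
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation
from sklearn.metrics.pairwise import cosine_similarity

# Toy stand-in for PubMed abstracts (the paper mines COVID-19 literature).
docs = [
    "covid vaccine efficacy trial immune response",
    "sars cov spike protein binding receptor",
    "pandemic lockdown mental health survey anxiety",
    "icu ventilation mortality risk factors patients",
    "rna sequencing variant mutation genome analysis",
    "telemedicine remote care adoption during pandemic",
]

# Step 1: estimate topic-word distributions (LDA here; the paper uses BTM).
vectorizer = CountVectorizer(stop_words="english")
X = vectorizer.fit_transform(docs)
lda = LatentDirichletAllocation(n_components=4, random_state=0)
lda.fit(X)
topic_word = lda.components_ / lda.components_.sum(axis=1, keepdims=True)

# Step 2: score topic-topic interactions (cosine similarity as a proxy for
# distances between latent positions estimated by the LSIRM).
interaction = cosine_similarity(topic_word)

# Step 3: map topics onto a network, keeping only stronger interactions.
G = nx.Graph()
n_topics = interaction.shape[0]
G.add_nodes_from(range(n_topics))
threshold = 0.1  # illustrative cutoff
for i in range(n_topics):
    for j in range(i + 1, n_topics):
        if interaction[i, j] > threshold:
            G.add_edge(i, j, weight=float(interaction[i, j]))

# Label each topic node with its top words (a simple stand-in for the
# word-selection score proposed in the paper).
terms = vectorizer.get_feature_names_out()
labels = {
    k: ", ".join(terms[idx] for idx in topic_word[k].argsort()[::-1][:3])
    for k in range(n_topics)
}

pos = nx.spring_layout(G, seed=0)
nx.draw(G, pos, node_color="lightblue")
nx.draw_networkx_labels(G, pos, labels, font_size=8)
plt.title("Topic interaction map (illustrative)")
plt.show()
```

Repeating step 3 over several vocabulary sizes and tracking how node positions shift would correspond, loosely, to the paper's "trajectory plot" across levels of word richness.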

Authors (4)
  1. Yeseul Jeon (9 papers)
  2. Dongjun Chung (9 papers)
  3. Jina Park (9 papers)
  4. Ick Hoon Jin (21 papers)
