
TeamTat: a collaborative text annotation tool (2004.11894v1)

Published 24 Apr 2020 in cs.HC and cs.IR

Abstract: Manually annotated data is key to developing text-mining and information-extraction algorithms. However, human annotation requires considerable time, effort and expertise. Given the rapid growth of biomedical literature, it is paramount to build tools that facilitate speed and maintain expert quality. While existing text annotation tools may provide user-friendly interfaces to domain experts, limited support is available for image display, project management, and multi-user team annotation. In response, we developed TeamTat (teamtat.org), a web-based annotation tool (local setup available), equipped to manage team annotation projects engagingly and efficiently. TeamTat is a novel tool for managing multi-user, multi-label document annotation, reflecting the entire production life cycle. Project managers can specify the annotation schema for entities and relations, select annotator(s), and distribute documents anonymously to prevent bias. Document input format can be plain text, PDF or BioC (uploaded locally or automatically retrieved from PubMed or PMC), and output format is BioC with inline annotations. TeamTat displays figures from the full text for the annotators' convenience. Multiple users can work on the same document independently in their own workspaces, and the team manager can track task completion. TeamTat provides corpus-quality assessment via inter-annotator agreement statistics, and a user-friendly interface for annotation review and inter-annotator disagreement resolution to improve corpus quality.
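The abstract mentions corpus-quality assessment via inter-annotator agreement statistics. As a minimal sketch of what such a statistic looks like, the snippet below computes Cohen's kappa for two annotators' entity labels; the paper does not specify which agreement measures TeamTat implements, so the choice of kappa and the example labels are illustrative assumptions.

```python
# Hedged sketch: Cohen's kappa, a common inter-annotator agreement
# statistic. The label values below are hypothetical biomedical
# entity types, not taken from the TeamTat paper.
from collections import Counter

def cohens_kappa(labels_a, labels_b):
    """Kappa = (p_o - p_e) / (1 - p_e), where p_o is observed agreement
    and p_e is the agreement expected if annotators labelled independently."""
    assert len(labels_a) == len(labels_b) and labels_a
    n = len(labels_a)
    # Observed agreement: fraction of items both annotators labelled the same.
    p_o = sum(a == b for a, b in zip(labels_a, labels_b)) / n
    # Expected agreement from each annotator's marginal label distribution.
    ca, cb = Counter(labels_a), Counter(labels_b)
    p_e = sum(ca[k] * cb.get(k, 0) for k in ca) / (n * n)
    return 1.0 if p_e == 1 else (p_o - p_e) / (1 - p_e)

ann1 = ["Gene", "Disease", "Gene", "Chemical", "Gene"]
ann2 = ["Gene", "Disease", "Chemical", "Chemical", "Gene"]
print(round(cohens_kappa(ann1, ann2), 4))  # → 0.6875
```

A kappa near 1 indicates near-perfect agreement, while values near 0 indicate agreement no better than chance; a review interface like TeamTat's can then surface the specific disagreements for resolution.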

Authors (4)
  1. Rezarta Islamaj (5 papers)
  2. Dongseop Kwon (1 paper)
  3. Sun Kim (26 papers)
  4. Zhiyong Lu (113 papers)
Citations (50)