Attend and select: A segment selective transformer for microblog hashtag generation (2106.03151v2)

Published 6 Jun 2021 in cs.CL, cs.AI, and cs.IR

Abstract: Hashtag generation aims to produce short, informal topical tags for a microblog post, where the hashtags are formed from tokens or phrases. These tokens or phrases may originate from primary fragmental textual pieces (e.g., segments) in the original text and may be scattered across different segments. However, conventional sequence-to-sequence generation methods struggle to filter out secondary information across different textual granularities and are poor at selecting crucial tokens, making them suboptimal for generating condensed hashtags. In this work, we propose a modified Transformer-based generation model that adds a segment-selection procedure to the standard encoding and decoding phases. The segment-selection phase is based on a novel Segments Selection Mechanism (SSM) that models textual granularity at three levels (the global text, local segments, and individual tokens), contributing to the generation of condensed hashtags. Specifically, it first attends to primary semantic segments and then transforms discontinuous segments from the source text into a sequence of hashtags by selecting crucial tokens. Extensive evaluations on two datasets demonstrate our approach's superiority, with significant improvements over extraction and generation baselines. The code and datasets are available at https://github.com/OpenSUM/HashtagGen.
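The core idea of the Segments Selection Mechanism, as described in the abstract, can be illustrated with a minimal sketch: score each segment by attention-pooling its tokens against a global query vector, keep the top-scoring segments, then keep the most relevant tokens within them. This is a hypothetical illustration of the selection idea, not the paper's actual implementation; the function name, the pooling scheme, and all parameters are assumptions for demonstration.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def select_segments_and_tokens(token_emb, seg_bounds, query, k_seg=2, k_tok=3):
    """Illustrative segment-then-token selection (not the paper's code).

    token_emb:  (num_tokens, dim) array of token embeddings
    seg_bounds: list of (start, end) index pairs delimiting segments
    query:      (dim,) global query vector standing in for the global text
    Returns the indices of the selected crucial tokens.
    """
    seg_scores, pooled = [], []
    for s, e in seg_bounds:
        # Attention-pool the tokens of this segment against the global query.
        attn = softmax(token_emb[s:e] @ query)
        vec = attn @ token_emb[s:e]
        pooled.append(vec)
        # Score the segment by its pooled vector's match with the query.
        seg_scores.append(vec @ query)
    # Keep the k_seg highest-scoring (possibly discontinuous) segments.
    top_segs = np.argsort(seg_scores)[::-1][:k_seg]
    selected = []
    for i in sorted(top_segs):
        s, e = seg_bounds[i]
        # Within each kept segment, keep the k_tok most relevant tokens.
        tok_scores = token_emb[s:e] @ query
        keep = np.argsort(tok_scores)[::-1][:k_tok]
        selected.extend(sorted(s + int(j) for j in keep))
    return selected
```

In a full model the query would come from the encoder's global representation and the selected tokens would feed the decoder; here a fixed query vector stands in for that context.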

Citations (3)
