Metaphorical Paraphrase Generation: Feeding Metaphorical Language Models with Literal Texts (2210.04756v2)

Published 10 Oct 2022 in cs.CL

Abstract: This study presents a new approach to metaphorical paraphrase generation by masking literal tokens of literal sentences and unmasking them with metaphorical language models. Unlike similar studies, the proposed algorithm focuses not only on verbs but also on nouns and adjectives. Although the transfer rate for verbs is the highest (56%), transferring nouns and adjectives is also feasible (24% and 31%, respectively). Human evaluation showed that our system-generated metaphors are considered more creative and metaphorical than human-generated ones, while using our transferred metaphors for data augmentation improves the state of the art in metaphorical sentence classification by 3% in F1.
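The mask-and-infill idea in the abstract can be illustrated with a short sketch: replace a literal content word (verb, noun, or adjective) with a mask token and let a masked language model propose substitutes. This is a minimal illustration only; the model name (`bert-base-uncased`), the spaCy-based part-of-speech selection, and the candidate-filtering heuristic are assumptions for demonstration, not the paper's released metaphorical language model or algorithm.

```python
# Minimal sketch of mask-and-infill paraphrasing (illustrative assumptions:
# a BERT-style masked LM via the Hugging Face fill-mask pipeline, and the
# literal token to replace chosen by part-of-speech tag with spaCy).
import spacy
from transformers import pipeline

nlp = spacy.load("en_core_web_sm")
# Placeholder model; the paper's metaphorical LM would be substituted here.
fill_mask = pipeline("fill-mask", model="bert-base-uncased")
MASK = fill_mask.tokenizer.mask_token

def metaphorize(sentence: str, pos: str = "VERB", top_k: int = 5):
    """Mask the first token with the given POS and let the LM refill it."""
    doc = nlp(sentence)
    for token in doc:
        if token.pos_ == pos:
            masked = sentence.replace(token.text, MASK, 1)
            # Return candidate paraphrases, skipping the original literal word.
            return [
                cand["sequence"]
                for cand in fill_mask(masked, top_k=top_k)
                if cand["token_str"].strip() != token.text.lower()
            ]
    return []

print(metaphorize("The news destroyed his hopes."))
```

With a masked LM fine-tuned on metaphorical text (as the paper proposes), the in-filled candidates would tend toward figurative substitutes rather than literal synonyms; the same loop can target nouns or adjectives by changing the `pos` argument.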

Authors (2)
  1. Giorgio Ottolina (2 papers)
  2. John Pavlopoulos (31 papers)
Citations (1)
