Manifold-based Verbalizer Space Re-embedding for Tuning-free Prompt-based Classification (2309.04174v2)

Published 8 Sep 2023 in cs.CL and cs.AI

Abstract: Prompt-based classification adapts tasks to a cloze-question format using the [MASK] token, and the filled-in tokens are then mapped to labels through pre-defined verbalizers. Recent studies have explored verbalizer embeddings to reduce the labor in this process. However, all existing approaches require tuning either the pre-trained model or additional trainable embeddings. Moreover, distances between high-dimensional verbalizer embeddings should not be measured with the Euclidean metric, since the representation space may lie on a non-linear manifold. In this study, we propose a tuning-free manifold-based re-embedding method for verbalizer embeddings, Locally Linear Embedding with Intra-class Neighborhood Constraint (LLE-INC), which preserves local properties within the same class as guidance for classification. Experimental results indicate that even without tuning any parameters, LLE-INC is on par with automated verbalizers that require parameter tuning. With parameter updating, our approach further improves prompt-based tuning by up to 3.2%. Furthermore, experiments with LLaMA-7B and LLaMA-13B indicate that LLE-INC is an efficient tuning-free classification approach for hyper-scale LLMs.
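
The abstract describes LLE-INC only at a high level, and the paper's exact formulation is not reproduced on this page. The sketch below illustrates the general idea under that caveat: standard locally linear embedding in which each verbalizer embedding's reconstruction neighbors are restricted to points of the same class. All function names and parameters here are illustrative, not the authors' code.

```python
import numpy as np

def lle_inc(X, y, n_neighbors=5, n_components=2, reg=1e-3):
    """Locally Linear Embedding with an intra-class neighborhood
    constraint (illustrative sketch, not the authors' implementation).

    X: (n, d) verbalizer embeddings from a frozen pre-trained model
    y: (n,)   integer class label for each embedding
    """
    n = X.shape[0]
    W = np.zeros((n, n))
    for i in range(n):
        # Intra-class constraint: candidate neighbors must share point i's label.
        same = np.where(y == y[i])[0]
        same = same[same != i]
        dists = np.linalg.norm(X[same] - X[i], axis=1)
        nbrs = same[np.argsort(dists)[:n_neighbors]]
        k = len(nbrs)
        # Standard LLE reconstruction weights: solve the regularized local
        # Gram system G w = 1, then normalize the weights to sum to one.
        Z = X[nbrs] - X[i]
        G = Z @ Z.T
        G += reg * (np.trace(G) + 1e-12) * np.eye(k)
        w = np.linalg.solve(G, np.ones(k))
        W[i, nbrs] = w / w.sum()
    # Global embedding: bottom eigenvectors of (I - W)^T (I - W).
    # With per-class neighborhoods, one trivial (constant-per-class)
    # eigenvector arises for each class, so we skip that many.
    M = (np.eye(n) - W).T @ (np.eye(n) - W)
    _, vecs = np.linalg.eigh(M)
    n_skip = len(np.unique(y))
    return vecs[:, n_skip:n_skip + n_components]
```

A [MASK] representation could then be classified, for example, by its nearest class in the re-embedded space after an out-of-sample mapping. Since the pre-trained model stays frozen and only a closed-form eigendecomposition is solved, the procedure is tuning-free, consistent with the abstract's claim.
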

Authors (7)
  1. Haochun Wang (17 papers)
  2. Sendong Zhao (31 papers)
  3. Chi Liu (65 papers)
  4. Nuwa Xi (11 papers)
  5. Muzhen Cai (5 papers)
  6. Bing Qin (186 papers)
  7. Ting Liu (329 papers)
Citations (1)
