
$k$-Neighbor Based Curriculum Sampling for Sequence Prediction (2101.09313v1)

Published 22 Jan 2021 in cs.CL and cs.LG

Abstract: Multi-step ahead prediction in language models is challenging due to the discrepancy between training and test-time processes. At test time, a sequence predictor is required to make predictions given past predictions as input, instead of the past targets that are provided during training. This difference, known as exposure bias, can lead to the compounding of errors along a generated sequence at test time. To improve generalization in neural language models and address compounding errors, we propose \textit{Nearest-Neighbor Replacement Sampling} -- a curriculum learning-based method that gradually changes an initially deterministic teacher policy to a stochastic policy. A token at a given time-step is replaced with a sampled nearest neighbor of the past target, with a truncated probability proportional to the cosine similarity between the original word and its top $k$ most similar words. This allows the learner to explore alternatives when the current policy provided by the teacher is sub-optimal or difficult to learn from. The proposed method is straightforward, online and requires little additional memory. We report our findings on two language modelling benchmarks and find that the proposed method further improves performance when used in conjunction with scheduled sampling.
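
The replacement step described in the abstract lends itself to a short illustration. The Python/NumPy sketch below shows one plausible reading of the procedure: for each target token, with a curriculum-controlled probability, sample a substitute from its top-$k$ cosine neighbors in a given embedding matrix, with sampling probability proportional to the (truncated) similarity. The function names, the linear replacement schedule, and the similarity clipping are illustrative assumptions, not the authors' exact formulation.

```python
import numpy as np

def nearest_neighbor_replace(token_id, embeddings, k=5, rng=None):
    """Sample a replacement for token_id from its top-k cosine neighbors,
    with probability proportional to cosine similarity (a sketch of
    Nearest-Neighbor Replacement Sampling; exact details may differ)."""
    rng = rng or np.random.default_rng()
    v = embeddings[token_id]
    # Cosine similarity between the token and every word in the vocabulary.
    sims = embeddings @ v / (
        np.linalg.norm(embeddings, axis=1) * np.linalg.norm(v) + 1e-8
    )
    sims[token_id] = -np.inf                      # exclude the token itself
    top_k = np.argpartition(-sims, k)[:k]         # indices of the k nearest words
    weights = np.clip(sims[top_k], 1e-8, None)    # truncate to a valid distribution
    return rng.choice(top_k, p=weights / weights.sum())

def curriculum_inputs(targets, embeddings, step, total_steps, k=5, rng=None):
    """Build training inputs: start fully deterministic (teacher forcing) and
    gradually replace past targets with sampled neighbors as training proceeds."""
    rng = rng or np.random.default_rng()
    replace_prob = step / total_steps             # hypothetical linear schedule
    return [
        nearest_neighbor_replace(t, embeddings, k, rng)
        if rng.random() < replace_prob else t
        for t in targets
    ]
```

In the paper's setup this replacement is used alongside scheduled sampling; the linear ramp above is only one simple choice of schedule for the deterministic-to-stochastic curriculum.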

Authors (2)
  1. James O'Neill (17 papers)
  2. Danushka Bollegala (84 papers)
