Contrastive Decoding, a simple and training-free text generation method, has shown significant improvements over greedy decoding in a variety of reasoning tasks.
It not only prevents some abstract reasoning errors and avoids simpler degeneration modes, such as copying sections of the input, but also outperforms nucleus sampling on long-form generation and greedy decoding on reasoning tasks.
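For intuition, the scoring rule at the heart of Contrastive Decoding can be sketched as follows: at each step, tokens are restricted to those the stronger "expert" model finds plausible, and among those the token maximizing the expert-amateur log-probability gap is chosen. This is a minimal illustrative sketch, assuming per-token log-probabilities from both models are available; the function name, the `alpha` cutoff value, and the toy distributions are ours, not from the text above.

```python
import numpy as np

def contrastive_decode_step(expert_logprobs, amateur_logprobs, alpha=0.1):
    """One greedy Contrastive Decoding step over a vocabulary.

    Keeps only tokens whose expert probability is at least alpha times
    the expert's top probability (the plausibility constraint), then
    returns the plausible token with the largest expert-amateur
    log-probability difference.
    """
    expert_logprobs = np.asarray(expert_logprobs, dtype=float)
    amateur_logprobs = np.asarray(amateur_logprobs, dtype=float)
    # Plausibility constraint: p_expert(token) >= alpha * max p_expert
    cutoff = np.log(alpha) + expert_logprobs.max()
    plausible = expert_logprobs >= cutoff
    # Contrastive score: penalize tokens the amateur also rates highly
    scores = np.where(plausible, expert_logprobs - amateur_logprobs, -np.inf)
    return int(np.argmax(scores))

# Toy example: greedy decoding would pick token 0 (highest expert
# probability), but CD prefers token 1, where the expert is far more
# confident than the amateur.
expert = np.log([0.5, 0.4, 0.1])
amateur = np.log([0.45, 0.1, 0.45])
choice = contrastive_decode_step(expert, amateur, alpha=0.1)
```

The plausibility cutoff matters: without it, rare tokens the amateur assigns near-zero probability would dominate the score even when the expert barely supports them.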