
Coverage Conjecture: Why Complex Prompts Help More on Simple Cases

Determine whether, in chain-of-thought prompting for multi-step reasoning, complex prompts (i.e., in-context examples with more reasoning steps) elicit reasoning capabilities that subsume those needed for simpler test questions and therefore cover them better, which would explain the observed larger accuracy gains on cases requiring fewer reasoning steps.


Background

The paper introduces complexity-based prompting, which selects chain-of-thought examples with more reasoning steps as in-context demonstrations, and shows that this choice improves performance across several reasoning benchmarks.
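
As a rough illustration of the selection criterion (a sketch, not the paper's exact implementation), the snippet below picks the most complex chain-of-thought demonstrations from a candidate pool by counting newline-separated reasoning steps; the `question`/`rationale` field names, the step-counting heuristic, and the default of 8 demonstrations are assumptions made here for concreteness.

```python
def count_steps(rationale: str) -> int:
    """Approximate the number of reasoning steps by counting non-empty lines."""
    return sum(1 for line in rationale.splitlines() if line.strip())


def build_complex_prompt(candidates: list[dict], k: int = 8) -> str:
    """Select the k candidates with the most reasoning steps and format them
    as in-context chain-of-thought demonstrations."""
    most_complex = sorted(
        candidates, key=lambda ex: count_steps(ex["rationale"]), reverse=True
    )[:k]
    blocks = [
        f"Question: {ex['question']}\nAnswer: {ex['rationale']}"
        for ex in most_complex
    ]
    return "\n\n".join(blocks)
```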

In the Direction of Generalization analysis, the authors observe that complex prompts do not just help on hard problems; rather, they yield particularly notable improvements on test cases requiring fewer reasoning steps. To explain this pattern, they propose a conjecture about the nature of the capabilities elicited by complex prompts.
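
One hypothetical way to reproduce this kind of breakdown is to bin test cases by the number of reasoning steps in their reference solutions and compare per-bin accuracy under simple vs. complex prompts; the `"steps"` and `"correct"` field names below are illustrative assumptions rather than the paper's code.

```python
from collections import defaultdict


def accuracy_by_step_count(results: list[dict]) -> dict[int, float]:
    """`results` is assumed to be a list of dicts with an integer "steps" field
    (reference reasoning-step count) and a boolean "correct" field."""
    correct, total = defaultdict(int), defaultdict(int)
    for r in results:
        total[r["steps"]] += 1
        correct[r["steps"]] += int(r["correct"])
    return {s: correct[s] / total[s] for s in sorted(total)}


# Comparing two prompt settings bin by bin, e.g.:
# gains = {s: acc_complex[s] - acc_simple[s] for s in acc_simple if s in acc_complex}
```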

References

We conjecture that this is because the reasoning capabilities elicited by complex prompts may cover simple questions better.

Fu et al. (2022), "Complexity-Based Prompting for Multi-Step Reasoning" (arXiv:2210.00720), Section 4.2 (Main Results), "Direction of Generalization".