Do sequence-trained game models encode formal rules?
Determine whether superhuman-level game-playing foundation models trained purely on move sequences (for example, transformer models trained on chess or Othello game records) encode the formal rules of their games in their internal representations, or whether they achieve high performance solely through statistical pattern-matching that does not reflect the underlying rule structure.
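One common way to operationalize this question is linear probing: train a small linear classifier to decode a ground-truth game feature (for example, the contents of one board square) from a model's hidden states, and check whether decoding accuracy far exceeds chance. The sketch below illustrates the probe methodology only; the "hidden states" are synthetic stand-ins, not activations from any real model, and all names and dimensions are illustrative assumptions.

```python
import numpy as np

# Illustrative linear-probe sketch (synthetic data, not a real model).
rng = np.random.default_rng(0)
d_model, n_samples = 64, 2000

# Ground-truth board feature for one square: empty / own / opponent.
labels = rng.integers(0, 3, size=n_samples)

# Simulate hidden states that encode the label along a fixed class
# direction plus noise; in a real study these would be transformer
# activations recorded while the model processes move sequences.
class_dirs = rng.normal(size=(3, d_model))
hidden = class_dirs[labels] + 0.5 * rng.normal(size=(n_samples, d_model))

# Held-out split, then fit a linear probe by least squares on
# one-hot targets (a multinomial linear classifier).
n_train = 1500
X = np.hstack([hidden, np.ones((n_samples, 1))])  # append bias column
Y = np.eye(3)[labels]                              # one-hot targets
W, *_ = np.linalg.lstsq(X[:n_train], Y[:n_train], rcond=None)

preds = (X[n_train:] @ W).argmax(axis=1)
acc = (preds == labels[n_train:]).mean()  # chance level is 1/3
```

If held-out probe accuracy is well above the 1/3 chance level, the feature is linearly decodable from the representations; the interpretive debate is over whether such decodability constitutes "encoding the rules" or merely correlates with pattern statistics.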
References
It remains debated, for example, whether superhuman-level game-playing models trained purely on move sequences truly encode the game's rules (refs. 40-42, 47, 48), or whether models that solve analogies at a human level possess reasoning mechanisms like those of humans (ref. 49).
— From Prediction to Understanding: Will AI Foundation Models Transform Brain Science? (arXiv:2509.17280, Serre et al., 21 Sep 2025), main text, section 'Prediction is not Explanation'