ShortCircuit: AlphaZero-Driven Circuit Design (2408.09858v2)

Published 19 Aug 2024 in cs.LG and cs.AR

Abstract: Chip design relies heavily on generating Boolean circuits, such as AND-Inverter Graphs (AIGs), from functional descriptions like truth tables. This generation operation is a key process in logic synthesis, a primary chip design stage. While recent advances in deep learning have aimed to accelerate circuit design, these efforts have mostly focused on tasks other than synthesis, and traditional heuristic methods have plateaued. In this paper, we introduce ShortCircuit, a novel transformer-based architecture that leverages the structural properties of AIGs and performs efficient space exploration. Contrary to prior approaches attempting end-to-end generation of logic circuits using deep networks, ShortCircuit employs a two-phase process combining supervised with reinforcement learning to enhance generalization to unseen truth tables. We also propose an AlphaZero variant to handle the double exponentially large state space and the reward sparsity, enabling the discovery of near-optimal designs. To evaluate the generative performance of our model, we extract 500 truth tables from a set of 20 real-world circuits. ShortCircuit successfully generates AIGs for $98\%$ of the 8-input test truth tables, and outperforms the state-of-the-art logic synthesis tool, ABC, by $18.62\%$ in terms of circuit size.

Summary

  • The paper introduces a transformer-driven synthesis method that combines supervised learning and AlphaZero-style reinforcement to generate optimized AIGs.
  • The model achieves an 84.6% success rate and reduces circuit size by approximately 18.74% compared to traditional tools like ABC.
  • The approach opens avenues for future research on multi-output circuits and integration with existing EDA tools for enhanced chip design.

An Evaluation of ShortCircuit: A Transformer-Based Approach to Boolean Circuit Design

The paper "ShortCircuit: AlphaZero-Driven Circuit Design" proposes an innovative approach to generating Boolean circuits using a transformer-based architecture combined with reinforcement learning techniques inspired by AlphaZero. This novel methodology targets the automatic synthesis of AND-Inverter Graphs (AIGs) from functional descriptions, such as truth tables, thereby addressing an essential bottleneck in chip design and electronic design automation (EDA).

Motivation and Problem Definition

The synthesis of Boolean circuits is a critical component of modern chip design. Traditional methodologies, including heuristics and logic synthesis tools such as ABC, have plateaued in their ability to generate optimized circuits efficiently. The doubly exponential growth of the search space associated with Boolean functions demands new approaches to explore and optimize this space effectively. The challenge lies in balancing power, performance, and area (PPA) trade-offs while minimizing the size and complexity of the generated circuits.
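
To make the scale concrete: there are 2^(2^n) distinct single-output Boolean functions of n inputs, so the space of candidate truth tables grows doubly exponentially. The short snippet below is an illustrative aside (not code from the paper) that prints this count for a few input sizes.

```python
# Illustrative aside: the number of distinct single-output Boolean functions
# of n inputs is 2^(2^n), i.e. doubly exponential in n.
for n in (2, 4, 8):
    count = 2 ** (2 ** n)
    print(f"{n} inputs: 2^{2 ** n} = {float(count):.2e} possible truth tables")
```

For the 8-input tables used in the paper's evaluation, this already amounts to 2^256 (roughly 10^77) possible functions, which is why exhaustive enumeration is off the table.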

ShortCircuit Model

ShortCircuit aims to address these challenges using a two-phase process:

  1. Supervised Learning: To initialize the model and guide the generation of nodes.
  2. Reinforcement Learning with AlphaZero: To refine the model and explore the state space effectively.

The core of ShortCircuit is a transformer-based architecture designed to handle the structural properties of AIGs. The model operates by predicting the next AND node in the circuit generation process, given existing nodes and a target truth table. This prediction involves four parallel policy modules that provide probabilities for combining any pair of nodes with specific edge types. The value module estimates the quality of the current state, ensuring effective exploration and exploitation during training and inference.
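
To make this description more concrete, the sketch below shows one plausible way such a network could be assembled in PyTorch: a transformer encoder over per-node truth-table embeddings plus a token for the target table, four bilinear policy heads (one per edge-type combination) that score every ordered pair of existing nodes, and a scalar value head. The module layout, dimensions, and input encoding are assumptions made for illustration, not the authors' implementation.

```python
import torch
import torch.nn as nn

class ShortCircuitSketch(nn.Module):
    """Hypothetical sketch of a ShortCircuit-like network: a transformer encoder
    over AIG-node embeddings plus a target-truth-table token, four pairwise
    policy heads (one per edge-type combination), and a scalar value head."""

    def __init__(self, d_model=256, n_heads=8, n_layers=6, table_bits=256):
        super().__init__()
        # Assumed encoding: each node is represented by the truth table it computes
        # (256 bits for 8-input functions), as is the overall target function.
        self.node_embed = nn.Linear(table_bits, d_model)
        self.target_embed = nn.Linear(table_bits, d_model)
        layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=n_layers)
        # One bilinear scorer per edge-type combination of the two fanins.
        self.policy_heads = nn.ModuleList([nn.Bilinear(d_model, d_model, 1) for _ in range(4)])
        self.value_head = nn.Sequential(nn.Linear(d_model, d_model), nn.ReLU(), nn.Linear(d_model, 1))

    def forward(self, node_tables, target_table):
        # node_tables: (B, N, table_bits); target_table: (B, table_bits)
        tokens = torch.cat([self.target_embed(target_table).unsqueeze(1),
                            self.node_embed(node_tables)], dim=1)
        h = self.encoder(tokens)                      # (B, N + 1, d_model)
        target_h, node_h = h[:, 0], h[:, 1:]
        B, N, D = node_h.shape
        left = node_h.unsqueeze(2).expand(B, N, N, D).reshape(B, N * N, D)
        right = node_h.unsqueeze(1).expand(B, N, N, D).reshape(B, N * N, D)
        # Score every ordered node pair under each of the four edge-type heads;
        # a real implementation would also mask invalid or redundant pairs.
        logits = torch.cat([head(left, right) for head in self.policy_heads], dim=-1)  # (B, N*N, 4)
        policy = logits.reshape(B, -1).softmax(dim=-1)   # distribution over (pair, edge-type) actions
        value = self.value_head(target_h).squeeze(-1)    # estimated quality of the current state
        return policy, value
```

During search, the policy distribution would be renormalized over the legal actions, while the value estimate steers the exploration-exploitation balance in the tree search used for fine-tuning below.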

Training Methodology

To address the vast search space, the authors propose a two-stage training regimen:

  1. Pre-Training: Supervised learning teaches the policy network to generate AIGs, using a curated dataset of single-output AIGs constructed by extracting small subgraphs, or cuts, from larger circuits in the EPFL benchmark suite.
  2. Fine-Tuning: An AlphaZero-style approach fine-tunes the policy and value modules. This phase involves simulating multiple trajectories to explore the search space and refining the policy module's action probabilities based on the estimated value of states.
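
The summary above does not spell out the exact loss, but the standard AlphaZero-style training target that such a fine-tuning phase would use looks roughly like the sketch below: the policy head is regressed onto the MCTS visit-count distribution and the value head onto the episode's return. Treat it as a generic illustration rather than the authors' code.

```python
import torch
import torch.nn.functional as F

def alphazero_style_loss(policy_logits, value_pred, mcts_visit_probs, episode_return):
    """Generic AlphaZero-style target (an illustration, not the paper's exact loss):
    cross-entropy of the network policy against the MCTS visit-count distribution,
    plus MSE between the value prediction and the observed episode return."""
    policy_loss = -(mcts_visit_probs * F.log_softmax(policy_logits, dim=-1)).sum(dim=-1).mean()
    value_loss = F.mse_loss(value_pred, episode_return)
    return policy_loss + value_loss

# Toy usage with random tensors standing in for one batch of search results.
B, A = 16, 4 * 32 * 32  # batch size; action space = 4 edge types x 32^2 node pairs (made-up sizes)
logits = torch.randn(B, A, requires_grad=True)
values = torch.randn(B, requires_grad=True)
visits = torch.rand(B, A)
visits = visits / visits.sum(dim=-1, keepdim=True)   # normalized visit counts from MCTS
loss = alphazero_style_loss(logits, values, visits, torch.randn(B))
loss.backward()
```

In AlphaZero-style training, the sparse terminal reward (here, whether a correct and small AIG was produced) is backed up through the search tree, which is the usual way such a scheme copes with the reward sparsity noted in the abstract.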

Experimental Evaluation

The evaluation demonstrates ShortCircuit's effectiveness using a set of 500 truth tables extracted from real-world circuits in the EPFL benchmark suite. The results underscore the significant advances made by ShortCircuit in generating optimized AIGs compared to state-of-the-art methods:

  • Success Rate: ShortCircuit successfully generates AIGs for 84.6% of the test truth tables.
  • AIG Size: The generated AIGs contain an average of 8.957 AND-nodes, which reflects an improvement of approximately 18.74% over the state-of-the-art logic synthesis tool ABC.
  • Optimization Quality: The relative size reduction of circuits generated by ShortCircuit compared to the extracted cuts and ABC demonstrates the efficacy of this approach.
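
For reference, a percentage improvement of this kind is ordinarily computed as a relative reduction in AND-node count against a baseline; the snippet below shows that generic calculation (the paper's exact averaging scheme, per circuit versus aggregate, is not specified in this summary).

```python
def relative_size_reduction(baseline_and_nodes: float, generated_and_nodes: float) -> float:
    """Generic percent-reduction metric: how much smaller the generated AIG is than
    a baseline circuit (e.g. one produced by ABC), measured in AND-node count."""
    return 100.0 * (baseline_and_nodes - generated_and_nodes) / baseline_and_nodes

# Example with made-up node counts (not figures from the paper):
print(relative_size_reduction(baseline_and_nodes=12.0, generated_and_nodes=9.0))  # 25.0
```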

Additionally, the authors conduct an ablation study to investigate the impact of the number of Monte Carlo Tree Search (MCTS) simulations on success rate, circuit size, and execution time. The findings point to a trade-off between the number of simulations and these performance metrics, highlighting the practical compromise between search effort and runtime.

Implications and Future Research Directions

ShortCircuit's contributions push the envelope of machine learning applications within EDA, showcasing the potential of transformer architectures and AlphaZero-inspired training in logic synthesis. The work also points to several future research directions:

  1. Multi-Output AIGs: Extending ShortCircuit to handle multiple outputs presents a logical next step, potentially broadening its applicability.
  2. Integration with Existing Tools: Exploring the integration of ShortCircuit with traditional EDA tools could enhance overall design workflows.
  3. Industrial Application: Evaluating the model’s performance in industrial settings will provide practical insights and refine its scaling and deployment strategies.

Conclusion

The paper "ShortCircuit: AlphaZero-Driven Circuit Design" presents a sophisticated approach to logic synthesis using deep learning and reinforcement learning techniques. By leveraging the strengths of transformer architectures and AlphaZero strategies, it opens new avenues for efficient and optimized circuit design, addressing critical challenges in modern chip design. The promising results and potential future developments suggest significant advancements in the pursuit of innovative and scalable computing systems.