Logic Synthesis with Generative Deep Neural Networks (2406.04699v1)

Published 7 Jun 2024 in cs.LO and cs.AI

Abstract: While deep learning has achieved significant success in various domains, its application to logic circuit design has been limited by complex constraints and strict feasibility requirements. However, a recent generative deep neural model, "Circuit Transformer", has shown promise in this area by enabling equivalence-preserving circuit transformation on a small scale. In this paper, we introduce a logic synthesis rewriting operator based on the Circuit Transformer model, named "ctrw" (Circuit Transformer Rewriting), which incorporates the following techniques: (1) a two-stage training scheme for the Circuit Transformer tailored for logic synthesis, with iterative improvement of optimality through self-improvement training; (2) integration of the Circuit Transformer with state-of-the-art rewriting techniques to address scalability issues, allowing for guided DAG-aware rewriting. Experimental results on the IWLS 2023 contest benchmark demonstrate the effectiveness of our proposed rewriting methods.
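The self-improvement training in point (1) can be pictured as a loop in which the model proposes rewrites, only equivalence-preserving size reductions are kept, and the model is then fine-tuned on its own best outputs. The sketch below illustrates that idea under assumed interfaces; every name in it (sample_rewrite, finetune, equivalent, gate_count) is a hypothetical stand-in, not the paper's actual API.

```python
# A minimal sketch of an iterative self-improvement round, assuming a
# generative circuit model exposed as plain callables and an external
# equivalence checker (e.g. SAT or exhaustive simulation). All names
# here are hypothetical, not taken from the paper.

from typing import Callable, List, Tuple

Circuit = List[str]  # placeholder representation: one gate description per entry


def self_improvement_round(
    sample_rewrite: Callable[[Circuit], Circuit],      # model proposes a rewrite
    finetune: Callable[[List[Tuple[Circuit, Circuit]]], None],
    equivalent: Callable[[Circuit, Circuit], bool],    # functional equivalence check
    gate_count: Callable[[Circuit], int],              # optimization objective
    corpus: List[Circuit],
    samples_per_circuit: int = 8,
) -> None:
    """One round: sample candidate rewrites for each circuit, keep only
    equivalence-preserving size reductions, then fine-tune the model on
    the (input, improved output) pairs it produced itself."""
    training_pairs: List[Tuple[Circuit, Circuit]] = []
    for circuit in corpus:
        best = circuit
        for _ in range(samples_per_circuit):
            candidate = sample_rewrite(circuit)
            # Feasibility filter: only equivalent, smaller circuits survive.
            if equivalent(circuit, candidate) and gate_count(candidate) < gate_count(best):
                best = candidate
        if best is not circuit:
            training_pairs.append((circuit, best))
    # The model learns from its own best outputs, iteratively improving optimality.
    finetune(training_pairs)
```

Repeating such rounds lets the model's proposals improve without new labeled data, which is the essence of the self-improvement scheme the abstract describes.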

Authors (6)
  1. Xihan Li (12 papers)
  2. Xing Li (82 papers)
  3. Lei Chen (485 papers)
  4. Xing Zhang (104 papers)
  5. Mingxuan Yuan (81 papers)
  6. Jun Wang (991 papers)
Citations (3)
