LaMAGIC: Language-Model-based Topology Generation for Analog Integrated Circuits (2407.18269v2)

Published 19 Jul 2024 in cs.AR, cs.AI, and cs.LG

Abstract: In the realm of electronic and electrical engineering, automation of analog circuit design is increasingly vital given the complexity and customized requirements of modern applications. However, existing methods rely on search-based algorithms that require many simulation iterations to design a custom circuit topology, which is usually a time-consuming process. To this end, we introduce LaMAGIC, a pioneering LLM-based topology generation model that leverages supervised finetuning for automated analog circuit design. LaMAGIC can efficiently generate an optimized circuit design from a custom specification in a single pass. Our approach involves a meticulous development and analysis of various input and output formulations for circuits. These formulations ensure canonical representations of circuits and align with the autoregressive nature of LMs to effectively address the challenges of representing analog circuits as graphs. The experimental results show that LaMAGIC achieves a success rate of up to 96% under a strict tolerance of 0.01. We also examine the scalability and adaptability of LaMAGIC, specifically testing its performance on more complex circuits. Our findings reveal the enhanced effectiveness of our adjacency matrix-based circuit formulation with floating-point input, suggesting its suitability for handling intricate circuit designs. This research not only demonstrates the potential of LLMs in graph generation, but also builds a foundational framework for future explorations in automated analog circuit design.
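To make the adjacency matrix-based formulation described in the abstract more concrete, below is a minimal sketch of how a circuit topology and its floating-point specification might be serialized into an input/output text pair for supervised finetuning of a language model. The node names, specification fields, and serialization layout here are illustrative assumptions, not the paper's exact format.

```python
# Minimal sketch (assumed format, not LaMAGIC's exact one) of an
# adjacency-matrix-based circuit formulation with floating-point specs,
# serialized as an input/output pair for supervised finetuning of an LM.
from dataclasses import dataclass
from typing import List


@dataclass
class CircuitExample:
    nodes: List[str]            # component/terminal names, fixed canonical order
    adjacency: List[List[int]]  # symmetric 0/1 connectivity matrix over nodes
    voltage_ratio: float        # target output/input voltage ratio (spec)
    efficiency: float           # target power-conversion efficiency (spec)


def to_training_pair(ex: CircuitExample) -> dict:
    """Serialize the spec as the model input and the topology as the target."""
    # Floating-point specifications are kept as numbers in the input string.
    src = f"voltage_ratio={ex.voltage_ratio:.3f} efficiency={ex.efficiency:.3f}"
    # Canonical output: fixed node order, then the adjacency matrix row by row,
    # so each circuit maps to exactly one token sequence.
    rows = " ".join("".join(str(v) for v in row) for row in ex.adjacency)
    tgt = f"nodes: {' '.join(ex.nodes)} ; adj: {rows}"
    return {"input": src, "output": tgt}


# Hypothetical power-converter example for illustration only.
example = CircuitExample(
    nodes=["VIN", "VOUT", "GND", "SW0", "L0", "C0"],
    adjacency=[
        [0, 0, 0, 1, 0, 0],
        [0, 0, 0, 0, 1, 1],
        [0, 0, 0, 1, 0, 1],
        [1, 0, 1, 0, 1, 0],
        [0, 1, 0, 1, 0, 0],
        [0, 1, 1, 0, 0, 0],
    ],
    voltage_ratio=0.48,
    efficiency=0.92,
)
print(to_training_pair(example))
```

A canonical serialization of this kind matters because the same circuit graph can otherwise be written as many different token sequences, which would make the supervised target ambiguous for an autoregressive model.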

Authors (8)
  1. Chen-Chia Chang (10 papers)
  2. Shaoze Fan (1 paper)
  3. Jing Li (621 papers)
  4. Shun Zhang (105 papers)
  5. Ningyuan Cao (6 papers)
  6. Yiran Chen (176 papers)
  7. Xin Zhang (904 papers)
  8. Yikang Shen (62 papers)
Citations (3)