Natural Language to Verilog: Design of a Recurrent Spiking Neural Network using Large Language Models and ChatGPT (2405.01419v3)

Published 2 May 2024 in cs.AR and cs.AI

Abstract: This paper investigates the use of large language models (LLMs) and natural language prompts to generate hardware description code, namely Verilog. Building on our prior work, we employ OpenAI's ChatGPT4 and natural language prompts to synthesize an RTL Verilog module of a programmable recurrent spiking neural network, while also generating test benches to assess the system's correctness. The resulting design was validated on three simple machine learning tasks: the exclusive OR (XOR), IRIS flower classification, and MNIST handwritten digit classification. Furthermore, the design was validated on a Field-Programmable Gate Array (FPGA) and subsequently synthesized in the SkyWater 130 nm technology using an open-source electronic design automation flow. The design was submitted to Efabless Tiny Tapeout 6.
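The abstract does not reproduce any of the generated RTL, but a minimal sketch helps illustrate the kind of building block and test bench such a prompt-driven flow targets. The module below is a hypothetical leaky integrate-and-fire (LIF) neuron together with a smoke-test bench; all module names, port names, parameters, and thresholds are assumptions for illustration and are not taken from the paper's generated design.

```verilog
// Hypothetical sketch (not from the paper): a minimal leaky integrate-and-fire
// neuron. Parameter names, widths, and constants are illustrative assumptions.
module lif_neuron #(
    parameter WIDTH     = 16,
    parameter THRESHOLD = 16'd1000,
    parameter LEAK      = 16'd1
) (
    input  wire             clk,
    input  wire             rst_n,
    input  wire [WIDTH-1:0] in_current,  // weighted input current for this cycle
    output reg              spike_out    // one-cycle spike pulse
);
    reg [WIDTH-1:0] membrane;            // membrane potential accumulator

    always @(posedge clk or negedge rst_n) begin
        if (!rst_n) begin
            membrane  <= {WIDTH{1'b0}};
            spike_out <= 1'b0;
        end else if (membrane + in_current >= THRESHOLD) begin
            membrane  <= {WIDTH{1'b0}};  // fire and reset the membrane potential
            spike_out <= 1'b1;
        end else begin
            // integrate the input and apply a constant leak, clamping at zero
            membrane  <= (membrane + in_current > LEAK) ?
                         (membrane + in_current - LEAK) : {WIDTH{1'b0}};
            spike_out <= 1'b0;
        end
    end
endmodule

// Hypothetical smoke-test bench: drive a constant current and count spikes.
module lif_neuron_tb;
    reg         clk        = 1'b0;
    reg         rst_n      = 1'b0;
    reg  [15:0] in_current = 16'd0;
    wire        spike_out;
    integer     spikes     = 0;

    lif_neuron dut (.clk(clk), .rst_n(rst_n),
                    .in_current(in_current), .spike_out(spike_out));

    always #5 clk = ~clk;                 // 10 ns simulation clock

    always @(posedge clk)
        if (spike_out) spikes = spikes + 1;

    initial begin
        #20 rst_n = 1'b1;
        in_current = 16'd300;             // crosses THRESHOLD about every 4 cycles
        #400;
        $display("spikes observed: %0d", spikes);
        $finish;
    end
endmodule
```

In a flow like the one the paper describes, many such neurons would be instantiated with programmable weights and recurrent connections, and both the network module and its test benches would be produced from natural language prompts before simulation, FPGA validation, and synthesis.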

Authors (5)
  1. Paola Vitolo
  2. George Psaltakis
  3. Michael Tomlinson
  4. Gian Domenico Licciardo
  5. Andreas G. Andreou
