
Bayesian Photonic Accelerators for Energy Efficient and Noise Robust Neural Processing (2203.15806v2)

Published 29 Mar 2022 in cs.ET and physics.optics

Abstract: Artificial neural networks are efficient computing platforms inspired by the brain. Such platforms can tackle a vast range of real-life tasks, from image processing to language translation. Silicon photonic integrated chips (PICs), by employing coherent interactions in Mach-Zehnder interferometers, are promising accelerators offering record low power consumption and ultra-fast matrix multiplication. Such photonic accelerators, however, suffer from phase uncertainty due to fabrication errors and crosstalk effects that inhibit the development of high-density implementations. In this work, we present a Bayesian learning framework for such photonic accelerators. In addition to the conventional log-likelihood optimization path, two novel training schemes are derived, namely a regularized version and a fully Bayesian learning scheme. They are applied to a photonic neural network with 512 phase shifters targeting the MNIST dataset. The new schemes, when combined with a pre-characterization stage that provides the passive offsets, dramatically decrease the operational power of the PIC by more than 70%, with only a slight loss in classification accuracy. The fully Bayesian scheme, beyond this energy reduction, also returns information on the sensitivity of the phase shifters. This information is used to deactivate 31% of the phase actuators and thus significantly simplify the driving system.
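To make the regularized training path concrete, here is a minimal sketch of the kind of objective the abstract describes: a negative log-likelihood (cross-entropy) term plus a penalty on the phase-shifter settings. The function and parameter names (`regularized_nll`, `lam`) and the choice of an L1 penalty are illustrative assumptions, not the paper's exact formulation; the motivating idea is that the electrical drive power of a phase actuator grows with the applied phase, so penalizing phase magnitudes (relative to pre-characterized passive offsets) trades a little accuracy for a large power reduction.

```python
import numpy as np

def softmax(z):
    """Numerically stable softmax over the last axis."""
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def regularized_nll(logits, labels, phases, lam=0.01):
    """Illustrative regularized objective (assumed form, not the paper's exact loss):
    mean negative log-likelihood of the correct classes, plus lam * sum(|phase|).
    `phases` would be the actively driven phase values, measured relative to the
    passive offsets found in the pre-characterization stage."""
    probs = softmax(logits)
    nll = -np.mean(np.log(probs[np.arange(len(labels)), labels] + 1e-12))
    power_penalty = lam * np.abs(phases).sum()
    return nll + power_penalty
```

With `lam = 0` this reduces to ordinary log-likelihood training; increasing `lam` drives phases toward zero actuation, which is the mechanism behind the reported operational-power savings.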

Citations (6)
