
Large-scale photonic natural language processing (2208.13649v1)

Published 29 Aug 2022 in cs.ET and physics.optics

Abstract: Modern machine learning applications require huge artificial networks demanding in computational power and memory. Light-based platforms promise ultra-fast and energy-efficient hardware, which may help in realizing next-generation data processing devices. However, current photonic networks are limited by the number of input-output nodes that can be processed in a single shot. This restricted network capacity prevents their application to relevant large-scale problems such as natural language processing. Here, we realize a photonic processor with a capacity exceeding $1.5 \times 10^{10}$ optical nodes, more than one order of magnitude larger than any previous implementation, which enables photonic large-scale text encoding and classification. By exploiting the full three-dimensional structure of the optical field propagating in free space, we overcome the interpolation threshold and reach the over-parametrized region of machine learning, a condition that allows high-performance natural language processing with a minimal fraction of training points. Our results provide a novel solution to scale-up light-driven computing and open the route to photonic language processing.
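The setup described in the abstract is often modeled in software as a very wide random feature map (the optical scattering) followed by a trained linear readout, with the "over-parametrized region" reached once the number of optical nodes far exceeds the number of training samples. The sketch below is a conceptual analogue under that assumption, not the authors' optical pipeline: the data, node count, intensity nonlinearity, and ridge readout are illustrative choices, scaled down so the example runs instantly.

```python
import numpy as np

# Minimal software analogue (an assumption, not the paper's optical hardware):
# a large random projection plays the role of free-space scattering, and a
# ridge-regression readout plays the role of the trained digital layer.

rng = np.random.default_rng(0)

# Toy bag-of-words-like "text" vectors for two synthetic classes.
n_samples, vocab = 200, 50
X = rng.poisson(1.0, size=(n_samples, vocab)).astype(float)
y = np.repeat([0, 1], n_samples // 2)
X[y == 1, : vocab // 2] += 2.0  # class 1 uses the first half of the vocabulary more

# Random projection to many "optical nodes" (hypothetical size, far smaller
# than the paper's ~1.5e10 nodes).
n_nodes = 4000
W = rng.normal(size=(vocab, n_nodes)) / np.sqrt(vocab)
features = np.abs(X @ W) ** 2  # intensity-like |field|^2 nonlinearity

# Linear readout trained by ridge regression; with n_nodes >> n_train the model
# sits past the interpolation threshold, i.e. in the over-parametrized regime.
train = rng.permutation(n_samples)[: n_samples // 2]
test = np.setdiff1d(np.arange(n_samples), train)
A, t = features[train], 2.0 * y[train] - 1.0
lam = 1e-3
# Dual form of the ridge solution, cheap when n_train << n_nodes.
w = A.T @ np.linalg.solve(A @ A.T + lam * np.eye(len(train)), t)
pred = (features[test] @ w > 0).astype(int)
print("test accuracy:", (pred == y[test]).mean())
```

Running the sketch with more nodes than training points illustrates the regime the abstract refers to: the readout can fit the training set exactly while still generalizing, which is why only a small fraction of training points is needed.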

Citations (11)
