
Linley Newsletter

Photonic Computing Finds Fit in AI

December 10, 2019


Computing at the speed of light has long been a pie-in-the-sky concept, impractical to realize. A few image-processing niches use optical processing, but they don’t perform sequential operations as a computer does. Several industry trends are converging, however, improving the outlook for photonic computing. The most important is machine learning, which implements well-defined algorithms in neural networks. As Moore’s Law ends and digital-circuit scaling stalls, deep neural networks have rekindled interest in analog computing, and some photonic approaches perform their calculations in the analog domain.

Another trend is the high-volume production of silicon photonics and the availability of associated design tools, which are driving down the cost of optical systems. The technology is shipping in high-speed communications systems such as optical modules, and it’s emerging in sensing systems such as lidar (light detection and ranging). For computing, silicon photonics enables integration of many optical functions on a single die. Building on university research, several startups are developing neural-network accelerators based on silicon photonics. One branch of these efforts focuses on neural networks that employ matrix multiplication, such as convolutional neural networks (CNNs), whereas a second is exploring a neuromorphic approach that employs spiking neural networks.
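At a high level, a photonic matrix-multiply accelerator (for example, a mesh of interferometers) computes an analog matrix-vector product whose precision is limited by modulator resolution and detector noise. The sketch below is purely illustrative, not any vendor's design: the bit width and noise model are assumptions, chosen to show why analog precision matters for neural-network inference.

```python
import numpy as np

rng = np.random.default_rng(0)

def photonic_mvm(W, x, bits=6, noise=0.01):
    """Illustrative model of an analog photonic matrix-vector multiply:
    weights quantized to an assumed modulator precision, plus additive
    noise at the detectors. (Both parameters are assumptions.)"""
    scale = np.abs(W).max()
    levels = 2 ** bits - 1
    Wq = np.round(W / scale * levels) / levels * scale  # quantized weights
    y = Wq @ x          # in hardware, this product happens at light speed
    return y + rng.normal(0.0, noise * np.abs(y).max(), size=y.shape)

W = rng.standard_normal((4, 8))
x = rng.standard_normal(8)
exact = W @ x
approx = photonic_mvm(W, x)
print(np.max(np.abs(exact - approx)))  # small but nonzero analog error
```

Neural networks tolerate this kind of low-precision, noisy arithmetic far better than general-purpose code does, which is why machine learning is the natural first market for analog photonic computing.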

Another group of companies is pursuing free-space optics for machine learning. The free-space terminology simply means the light travels through air rather than through another medium such as silicon or glass fiber. These designs exploit coherent-light properties to process data in ways that seem esoteric to those of us in the digital-computing industry, but some are based on decades-old optical-correlation techniques. The free-space-optics companies are tapping into research sponsored by universities or the U.S. Department of Defense (DoD).
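One of those decades-old techniques is the classic "4f" optical correlator, which exploits the fact that a lens performs a two-dimensional Fourier transform: correlating an input scene with a template reduces to a pointwise multiply in the Fourier plane. This digital sketch mimics the principle with FFTs; the array sizes and template placement are illustrative assumptions.

```python
import numpy as np

def optical_correlate(scene, template):
    """Cross-correlate via the Fourier plane, as a 4f correlator does:
    transform, multiply by the conjugate template spectrum, transform back."""
    S = np.fft.fft2(scene)
    T = np.fft.fft2(template, s=scene.shape)
    return np.real(np.fft.ifft2(S * np.conj(T)))

scene = np.zeros((32, 32))
template = np.ones((4, 4))
scene[10:14, 20:24] = 1.0          # embed the template in the scene
corr = optical_correlate(scene, template)
peak = np.unravel_index(np.argmax(corr), corr.shape)
print(peak)  # → (10, 20): the correlation peak marks the template's location
```

In an optical implementation the two Fourier transforms are performed by lenses at the speed of light, so the correlation itself costs essentially no compute time.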

Photonic computing remains in its early days, with most designs either on paper or in proof-of-concept (PoC) prototypes. Funding is beginning to flow into this area, however, as AI-performance demands grow faster than conventional digital designs can scale.

Subscribers can view the full article in the Microprocessor Report.

Subscribe to the Microprocessor Report and always get the full story!

