AI NEWS HUB · by Eigenvector

A single beam of light runs AI with supercomputer power

ScienceDaily Robotics · November 16, 2025 · 1 min read

Aalto University researchers have developed a method to execute AI tensor operations using just one pass of light. By encoding data directly into light waves, they enable calculations to occur naturally and simultaneously. The approach works passively, without electronics, and could soon be integrated into photonic chips. If adopted, it promises dramatically faster and more energy-efficient AI systems.

Tensor operations are a form of advanced mathematics that support many modern technologies, especially artificial intelligence. These operations go far beyond the simple calculations most people encounter. A helpful way to picture them is to imagine manipulating a Rubik's cube in several dimensions at once by rotating, slicing, or rearranging its layers. Humans and traditional computers must break these tasks into sequences, but light can perform all of them at the same time.
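To make the "break these tasks into sequences" point concrete, here is a minimal sketch (an illustration, not code from the study): a matrix multiplication is one of the simplest tensor contractions, where every output element sums products over a shared index. Digital hardware walks these loops step by step, whereas the optical approach described below performs all of them in a single pass.

```python
# A matrix multiplication as a tensor contraction: each output element
# C[i][j] sums products A[i][k] * B[k][j] over the shared index k.
# A sequential processor must visit every (i, j, k) combination in turn.
def matmul(a, b):
    rows, inner, cols = len(a), len(b), len(b[0])
    return [[sum(a[i][k] * b[k][j] for k in range(inner))
             for j in range(cols)]
            for i in range(rows)]

a = [[1, 2], [3, 4]]
b = [[5, 6], [7, 8]]
print(matmul(a, b))  # → [[19, 22], [43, 50]]
```

Even this tiny 2×2 case requires eight multiplications and four additions executed one after another; at the scale of modern neural networks, the same pattern repeats billions of times.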

Today, tensor operations are essential for AI systems involved in image processing, language understanding, and countless other tasks. As the amount of data continues to grow, conventional digital hardware such as GPUs faces increasing strain in speed, energy use, and scalability.

Researchers Demonstrate Single-Shot Tensor Computing With Light

To address these challenges, an international team led by Dr. Yufeng Zhang from the Photonics Group at Aalto University's Department of Electronics and Nanoengineering has developed a fundamentally new approach. Their method allows complex tensor calculations to be completed within a single movement of light through an optical system. The process, described as single-shot tensor computing, functions at the speed of light.

"Our method performs the same kinds of operations that today's GPUs handle, like convolutions and attention layers, but does them all at the speed of light," says Dr. Zhang. "Instead of relying on electronic circuits, we use the physical properties of light to perform many computations simultaneously."

Encoding Information Into Light for High-Speed Computation

The team accomplished this by embedding digital information into the amplitude and phase of light waves, transforming numerical data into physical variations within the optical field. As these light waves interact, they automatically carry out mathematical procedures such as matrix and tensor multiplication, which form the basis of deep learning. By working with multiple wavelengths of light, the researchers expanded their technique to support even more complex, higher-order tensor operations.
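The idea of computing through interference can be sketched with ordinary complex numbers (a toy illustration under assumed conventions, not the paper's actual encoding): represent each number as a field with an amplitude and a phase, let modulation multiply two fields, and let coherent superposition sum the resulting contributions. The sum of products, the core of a matrix or tensor multiplication, then falls out of the physics rather than a loop.

```python
import cmath

# Toy model of amplitude/phase encoding. Each number becomes a complex
# "optical field": amplitude carries the magnitude, phase carries the
# sign (0 for positive, pi for negative). These conventions are assumed
# for illustration only.
def encode(x):
    return abs(x) * cmath.exp(1j * (0 if x >= 0 else cmath.pi))

def optical_dot(xs, ys):
    # Modulation: passing one field through an element set by the other
    # multiplies them. Interference: coherent superposition sums all
    # the products in a single "pass".
    fields = [encode(x) * encode(y) for x, y in zip(xs, ys)]
    total = sum(fields)
    return round(total.real, 9)

print(optical_dot([1, -2, 3], [4, 5, -6]))  # → -24.0
```

In a real photonic system the multiplications and the summation happen simultaneously as the light propagates; the simulation above merely mimics that outcome with complex arithmetic, and the wavelength-multiplexing step that lifts this from matrix products to higher-order tensor operations is not modeled here.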

"Imagine you're a customs officer who must inspect every parcel through multiple machines with different functions and then sort them into the right bins," Zhang says. "Normally, you'd process each parcel one by one. Our optical computing method merges all parcels and all machines together -- we create multiple 'optical hooks' that connect each input to its correct output. With just one operation, one pass of light, all inspections and sorting happen instantly and in parallel."

Passive Optical Processing With Wide Compatibility

One of the most striking benefits of this method is how little intervention it requires. The necessary operations occur on their own as the light travels, so the system does not need active control or electronic switching during computation.

"This approach can be implemented on almost any optical platform," says Professor Zhipei Sun, leader of Aalto University's Photonics Group. "In the future, we plan to integrate this computational framework directly onto photonic chips, enabling light-based processors to perform complex AI tasks with extremely low power consumption."

Path Toward Future Light-Based AI Hardware

Zhang notes that the ultimate objective is to adapt the technique to existing hardware and platforms used by major technology companies. He estimates that the method could be incorporated into such systems within 3 to 5 years.

"This will create a new generation of optical computing systems, significantly accelerating complex AI tasks across a myriad of fields," he concludes.

The study was published in Nature Photonics on November 14th, 2025.
