Computing with light using a photonic Native Processing Unit (NPU) leads to energy and speed savings, especially in computationally intensive applications. The NPU presented by Q.ANT is equipped with a standard PCIe interface.
Computing with photons for AI applications: the photonic Native Processing Unit (NPU) impressed in initial benchmark tests.
With Q.ANT's introduction of the photonic Native Processing Unit (NPU), which connects via a standard PCIe interface, computing technology is set to change in the coming years. For electronics developers, this means working not only more efficiently but also more sustainably. But what exactly makes the photonic architecture so special, and why is it relevant to the electronics industry?
When light plays a significant role
The photonic architecture fundamentally differs from classic CMOS processors: instead of processing data by switching transistors, the NPU uses light as a computing medium. This brings significant advantages.
"Processing with light significantly reduces energy consumption, as there are no losses from charging and discharging transistors. Additionally, control via electric fields instead of currents allows for a drastic reduction in power consumption and an increase in operating frequency," says Dr. Michael Förtsch, CEO of Q.ANT.
A clear example is the Fourier transform: while classical processors need millions of transistors for it, the NPU performs the same operation in a single optical element. Wavelength multiplexing allows multiple operations to run in parallel, opening up entirely new dimensions for AI inference and machine learning.
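The Fourier-transform claim can be made concrete: the discrete Fourier transform is one fixed linear operator, which is why a single optical element can apply it in one pass. The sketch below is purely illustrative numeric code (not Q.ANT's toolkit), showing the whole transform as a single matrix-vector product.

```python
import cmath

def dft_matrix(n):
    # The n x n discrete Fourier transform is one fixed linear operator;
    # this is the map an optical element can apply in a single pass.
    w = cmath.exp(-2j * cmath.pi / n)
    return [[w ** (j * k) for k in range(n)] for j in range(n)]

def apply(matrix, vec):
    # One matrix-vector product: a single "operation" from the
    # optical point of view, millions of transistor switchings digitally.
    return [sum(m * v for m, v in zip(row, vec)) for row in matrix]

signal = [1.0, 0.0, 0.0, 0.0]        # unit impulse as a test input
spectrum = apply(dft_matrix(4), signal)
# The DFT of a unit impulse is flat: every component equals 1.
```

Wavelength multiplexing would correspond to running several such products side by side, one per carrier wavelength.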
Communication via standard PCIe interface
The block diagram shows everything that's inside the NPU: from the hardware layer to the first software layer.
(Image: Q.ANT)
For developers already working with established frameworks like Keras, TensorFlow, or PyTorch, the question arises: How easy is the integration of the new technology? "The NPU uses the standard PCIe interface, so no fundamental changes to existing systems are necessary," explains Dr. Förtsch. "With our Q.ANT Toolkit, developers can simply recompile their existing source code to utilize photonic computing."
The toolkit will provide basic operations for AI inference. Integration into one of the major AI frameworks is planned, leading to seamless integration and compatibility with industry standards like Keras, PyTorch, or TensorFlow.
End customers do not need to reprogram their source code to be compatible with the technology. "The source code simply needs to be recompiled with our compiler extension. That is our philosophy, and we believe that this way we can make photonic computing a part of this chip ecosystem," says Förtsch.
The initial benchmark results are impressive.
The NPU shows impressive results in benchmark tests: in experiments with MNIST datasets, the number of parameters was reduced by 43 percent and the required operations by 46 percent, without loss of accuracy. Simulations of larger models, such as GPT architectures, also show promising results.
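For a sense of scale, a hedged back-of-the-envelope sketch: the 784-128-10 layer widths below are a typical small MNIST classifier chosen purely for illustration, not Q.ANT's benchmark model; only the 43 percent figure comes from the reported results.

```python
def dense_params(layer_widths):
    # Count weights and biases of a fully connected network
    # given its layer widths.
    return sum(n_in * n_out + n_out
               for n_in, n_out in zip(layer_widths, layer_widths[1:]))

baseline = dense_params([784, 128, 10])  # illustrative MNIST-sized net
reduced = round(baseline * (1 - 0.43))   # applying the reported 43 % cut
print(baseline, reduced)                 # 101770 58009
```

Even on this toy scale, a 43 percent cut removes tens of thousands of parameters, which is where the claimed inference savings come from.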
"Our photonic design scales better than classical processors, especially with large problems. The energy consumption in AI inference is theoretically one thirtieth of what current technologies require," says Förtsch.
The impact of photonic computing goes far beyond energy savings. Especially in neural networks, activation functions, which are traditionally complex to implement, could be optimized significantly by the photonic architecture. The use of analog functions like sine and exponential accelerates training and reduces computational effort.
For the first time, we can train networks that would not be feasible with digital GPUs. This opens up entirely new possibilities, especially in high-performance computing.
Dr. Michael Förtsch, CEO of Q.ANT
Train neural networks more efficiently
Photonic computing opens up entirely new potential. "For the first time, one can train networks that are completely different from what a GPU can ever do," says Förtsch. The activation function is one of the most underestimated tools in AI networks, yet a central component of them: it determines how input data is processed in an artificial neuron and significantly influences whether, and in what form, the data is passed on to the next layer of the network.
"In neural networks, there are input parameters and network depths, and between the layers sit the activation functions. In the digital world, it is very complex to implement these functions in a parametrizable way in a digital circuit. In the photonic world, they can be a path to significantly more efficient training in the AI context: through the analog computer, the NPU, we have direct access to sine, cosine, and exponential functions. Used correctly, these new possibilities can lead to a significant reduction in parameters and thus to more efficient, faster training of neural networks. Our current focus is on HPC data centers. Here, dedicated compute nodes with NPUs (and GPUs) will be available in the future, using the optimal hardware for each application," Förtsch concludes. (heh)
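As a minimal sketch of the idea, assuming nothing about Q.ANT's actual toolkit: an activation function is simply the nonlinearity applied to a neuron's weighted sum, and the photonic claim is that sine, cosine, and exponential come "for free" in the analog domain. In software, the same forward step looks like this:

```python
import math

def neuron(weights, bias, inputs, activation):
    # Weighted sum of the inputs, then the activation function.
    # On the NPU, activations such as sin or exp would be native
    # analog operations rather than digital approximations.
    z = sum(w * x for w, x in zip(weights, inputs)) + bias
    return activation(z)

# Same neuron, two different analog-style activations (values illustrative):
y_sin = neuron([0.5, -0.2], 0.1, [1.0, 2.0], math.sin)
y_exp = neuron([0.5, -0.2], 0.1, [1.0, 2.0], math.exp)
```

Swapping the activation is a one-argument change here; the article's point is that the photonic hardware makes such parametrizable choices cheap at the physical level.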
Date: 08.12.2025