AI Algorithms in Measurement Technology Improved 6G Receiver Design Based on Machine Learning

By Hendrik Härter | Translated by AI | 2 min reading time

Nokia Bell Labs and Rohde & Schwarz have jointly developed an AI-supported 6G receiver. Machine learning compensates for signal distortion, increasing the uplink range by 10 to 25 % and improving throughput and energy efficiency.

Nokia Bell Labs and Rohde & Schwarz have developed an improved 6G receiver design that uses machine learning-based signal distortion compensation.
(Image: freely licensed / Pixabay)

The development of 6G networks faces considerable technical challenges, particularly due to the planned use of higher frequency ranges, which lead to increased path losses and therefore coverage limitations. Nokia Bell Labs and Rohde & Schwarz have developed an approach that is based on machine learning and compensates for the physical limitations through clever signal processing directly in the receiver.
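The frequency-driven path loss mentioned above can be made concrete with the free-space path loss formula. The carrier frequencies, distance, and propagation assumptions below are illustrative choices, not figures from the article; real deployments see additional losses beyond free space.

```python
import numpy as np

C = 299_792_458.0  # speed of light, m/s

def fspl_db(distance_m, freq_hz):
    """Free-space path loss in dB: 20*log10(4*pi*d*f/c)."""
    return 20 * np.log10(4 * np.pi * distance_m * freq_hz / C)

# Assumed carriers: 3.5 GHz (today's 5G mid-band) vs. 7 GHz (a candidate
# 6G upper mid-band). Doubling the frequency adds 20*log10(2) ≈ 6 dB of loss.
extra_loss_db = fspl_db(500, 7.0e9) - fspl_db(500, 3.5e9)

# Under the same exponent-2 assumption, the article's 10 to 25 % range gain
# corresponds to a link-budget improvement of 20*log10(1.10..1.25) dB.
gain_low_db = 20 * np.log10(1.10)   # ≈ 0.8 dB
gain_high_db = 20 * np.log10(1.25)  # ≈ 1.9 dB
```

Even one or two decibels recovered in the receiver matter here, because every extra 6 dB of loss from a higher carrier halves the free-space cell radius.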

The core principle is based on AI algorithms that can identify and compensate for signal distortions in real time. The system performs adaptive filtering based on the current channel characteristics and simultaneously optimizes noise suppression and symbol decision. This signal reconstruction makes it possible to significantly improve spectral efficiency and make better use of the available bandwidth.
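The adaptive filtering described above can be illustrated with a classical least-mean-squares (LMS) equalizer, the baseline that learned receivers build on. The modulation, channel taps, step size, and filter length below are illustrative assumptions, not details of the Nokia/R&S system.

```python
import numpy as np

rng = np.random.default_rng(0)

# BPSK uplink symbols (assumed modulation, for illustration only)
num_symbols = 5000
symbols = rng.choice([-1.0, 1.0], size=num_symbols)

# Assumed distorting channel: inter-symbol interference plus receiver noise
channel = np.array([1.0, 0.6, 0.5])
received = np.convolve(symbols, channel)[:num_symbols]
received += 0.05 * rng.standard_normal(num_symbols)

# LMS adaptive equalizer: learns tap weights that undo the distortion
num_taps, mu = 8, 0.01
weights = np.zeros(num_taps)
for n in range(num_taps, num_symbols):
    window = received[n - num_taps + 1 : n + 1][::-1]  # newest sample first
    error = symbols[n] - weights @ window  # training symbol known at receiver
    weights += mu * error * window         # stochastic-gradient update

# Symbol decisions on the last 500 symbols, after the filter has adapted
tail = range(num_symbols - 500, num_symbols)
decisions = np.array([
    np.sign(weights @ received[n - num_taps + 1 : n + 1][::-1]) for n in tail
])
accuracy = float(np.mean(decisions == symbols[-500:]))
```

A neural receiver replaces the linear tap update with a learned nonlinear mapping, which is what allows it to jointly handle noise suppression and symbol decisions rather than inverting the channel alone.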

A Complex Test Setup

The developers built a complex test setup for validation. The R&S SMW200A vector signal generator generates realistic uplink signals and emulates various channel conditions. The actual AI processing is performed by the newly introduced FSWX signal and spectrum analyzer from Rohde & Schwarz, which runs the AI inference for the Nokia receiver. This configuration makes it possible to emulate 6G transmission conditions faithfully and to test the algorithms under conditions close to real deployment.

The measurement results show clear performance improvements: the uplink range increased by 10 to 25 % compared with conventional receivers. Notably, the AI technology optimizes several parameters simultaneously, improving throughput and energy efficiency in addition to range. These combined gains are crucial for commercial deployments.

New Requirements and Opportunities

From a developer's perspective, this technology creates new requirements and opportunities. AI inference must be implemented in real time in the receiver front end, which requires special considerations regarding computing power and latency in signal processing. Developers need to find a careful balance between algorithm complexity and energy consumption, especially for mobile applications. At the same time, new opportunities are opening up for software-defined functionalities and adaptive systems.
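The balance between computing power and latency can be framed as a back-of-the-envelope budget check. All numbers below are illustrative assumptions, not figures from Nokia or Rohde & Schwarz.

```python
# Hypothetical real-time feasibility check for per-symbol AI inference.
symbol_rate = 120e6        # uplink symbols per second (assumed)
macs_per_symbol = 2_000    # multiply-accumulate ops per inference (assumed)
accelerator_macs = 1e12    # sustained MAC/s of the receiver's DSP (assumed)

required_macs = symbol_rate * macs_per_symbol    # MAC/s the model demands
utilization = required_macs / accelerator_macs   # share of the compute budget

# utilization ≈ 0.24 here: feasible in real time, but a larger model or a
# higher symbol rate quickly exhausts the budget, which is exactly the
# accuracy-versus-complexity trade-off developers must manage.
```

The same arithmetic, run with a mobile device's much smaller compute and energy budget, explains why algorithm complexity must be traded off against power consumption.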

The algorithm development itself poses particular challenges. The learning methods must adapt to different channel conditions while remaining robust against changing environmental conditions. Building an efficient training and inference pipeline for production environments requires carefully optimizing the trade-off between accuracy and computational effort.

Integrating Into Existing 5G Infrastructures

Of particular interest to system developers is that this technology makes it possible to build 6G networks on existing 5G sites. The improved receiver performance reduces the need for additional cell density and can therefore significantly cut rollout costs. Development is already taking place in the pre-standard phase, which gives developers the opportunity to gain experience early and take compatibility aspects into account.

However, technical challenges remain. Latency management is critical, as AI processing must take place within strict 6G latency budgets. For mobile devices, ML algorithms must be optimized for energy efficiency without compromising performance. Standardization activities will significantly influence how widely this technology is adopted in practice. (heh)
