Automated guided vehicles (AGVs) and mobile, wheel-driven robots currently dominate production halls and warehouses. However, as requirements increase, it is becoming clear that wheels alone are no longer enough. The demand for machines that can move in more complex, human-oriented environments is growing.
Humanoid robots can support people without the environment having to be adapted to them.
In hospitals, restaurants, private homes, outdoors, and in many warehouses and production halls, perfectly flat floors and open spaces free of steps and obstacles are the exception. Instead, robots have to cross thresholds, avoid unpredictable obstacles and adapt on the fly to a world that was never designed with robots in mind.
Humanoid robots are a logical answer to these challenges because they are modeled on the human body. Three core areas currently determine development for demanding application environments: motion control, environmental perception and navigation, and modularity and flexibility. Together, innovations in these areas are carrying robots beyond strictly structured industrial environments and turning them into truly versatile autonomous systems.
Precise Movement Control in Unstructured Environments
Conventional industrial robots carry out pre-programmed movements in known working environments, such as three-axis robot arms or gantry systems permanently installed on rails. In contrast, walking and humanoid robots require real-time control in closed control loops over dozens of degrees of freedom. Every step and every joint movement must be executed within milliseconds in such a way that stability, torque and posture are balanced.
Modern robot systems are thus moving away from the classic concept of central motor control. Each joint or limb has its own microcontroller, which is responsible for extremely fast torque and position control loops. A central processing unit only coordinates the whole body movement. This distributed control reduces communication delays and ensures smoother, more stable movement sequences, which are crucial, for example, when a robot has to balance itself after a collision.
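The split described above can be sketched in a few lines: each joint runs its own fast local loop, while a central coordinator only distributes pose targets at a much lower rate. This is a minimal illustration, not a real firmware design; all class names, gains and the unit-inertia dynamics are invented for the sketch.

```python
# Sketch of distributed joint control: fast per-joint PD loops, slow central
# coordination. Gains and dynamics are illustrative only.

class JointController:
    """Fast local loop, as it would run on a per-joint microcontroller."""

    def __init__(self, kp: float, kd: float):
        self.kp, self.kd = kp, kd
        self.position = 0.0
        self.velocity = 0.0
        self.target = 0.0

    def step(self, dt: float) -> float:
        # PD control: torque from position error plus velocity damping.
        error = self.target - self.position
        torque = self.kp * error - self.kd * self.velocity
        # Toy unit-inertia dynamics so the sketch is self-contained.
        self.velocity += torque * dt
        self.position += self.velocity * dt
        return torque


class CentralCoordinator:
    """Slow whole-body loop: only distributes new targets to the joints."""

    def __init__(self, joints: list[JointController]):
        self.joints = joints

    def set_pose(self, targets: list[float]) -> None:
        for joint, target in zip(self.joints, targets):
            joint.target = target


joints = [JointController(kp=40.0, kd=9.0) for _ in range(3)]
coordinator = CentralCoordinator(joints)
coordinator.set_pose([0.5, -0.3, 1.0])

# Local loops run at 1 kHz; the coordinator would update far less often.
for _ in range(2000):
    for joint in joints:
        joint.step(dt=0.001)

print([round(j.position, 2) for j in joints])  # converges to the commanded pose
```

The point of the pattern is visible in the loop structure: the millisecond-critical work never leaves the joint, so the coordinator's update rate and communication latency stop being the bottleneck.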
The coordinated control of several motors relies on real-time fieldbus protocols such as EtherCAT, which ensure synchronization in the sub-millisecond range. Newer standards include OPC UA FX over TSN, which uses Time-Sensitive Networking to further improve the reliable, low-latency communication required for industrial automation and advanced robotics. This advantage is evident in practical tests of two- and four-legged robots on uneven terrain, where precise timing prevents serious missteps.
Beyond the conventional control level, AI-supported motion planners predict shifts in the center of gravity and adjust the target joints in real time. This blurs the boundaries between motion planning and motor control, integrating sensory feedback into each individual step.
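The basic quantity such a planner tracks is the whole-body center of mass. A minimal sketch, with made-up link masses and positions: the CoM is the mass-weighted mean of the link positions, and a (here one-dimensional) support check tests whether its ground projection stays over the feet.

```python
# Center-of-mass estimate and a toy support check. Masses, positions and the
# support region are invented for illustration.

def center_of_mass(links):
    """links: list of (mass_kg, (x, y, z)) tuples."""
    total = sum(m for m, _ in links)
    return tuple(sum(m * p[i] for m, p in links) / total for i in range(3))

def inside_support(com, x_min, x_max):
    """1-D toy check: is the CoM's ground projection over the support region?"""
    return x_min <= com[0] <= x_max

links = [(10.0, (0.00, 0.0, 0.9)),   # torso
         (4.0,  (0.05, 0.1, 0.5)),   # left leg
         (4.0,  (0.05, -0.1, 0.5))]  # right leg
com = center_of_mass(links)
print(round(com[0], 3), inside_support(com, x_min=-0.1, x_max=0.1))
# → 0.022 True
```

A real planner evaluates this in three dimensions against the actual support polygon of the feet and predicts the trajectory of the CoM forward in time, but the arithmetic at its core is this weighted average.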
Perception and Navigation in Complex Environments
In traditional warehouses, several 2D LiDARs and cameras with QR code recognition are often sufficient. In unstructured, human-dominated environments, however, robots require much deeper and more accurate situational awareness.
Walking and humanoid robotic systems combine 3D LiDAR, time-of-flight (ToF) cameras and stereo vision to create three-dimensional maps in real time. Simultaneous Localization and Mapping (SLAM) algorithms fuse this sensor data with measured values from the Inertial Measurement Unit (IMU) to maintain precise orientation even in poor lighting or restricted visibility, such as behind curtains in a hospital.
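The role of the IMU in such a fusion can be shown with a deliberately simplified example: a one-axis complementary filter integrates the fast but drifting gyroscope and pulls the estimate toward an absolute heading reference, which here stands in for the LiDAR/vision correction a SLAM pipeline would supply. All values are synthetic.

```python
# One-axis complementary filter: gyro integration (weight alpha) blended with
# an absolute heading reference (weight 1 - alpha). Synthetic data only.

def complementary_filter(gyro_rates, reference_headings, dt, alpha=0.98):
    """Blend integrated gyro rate with an absolute reference, step by step."""
    heading = reference_headings[0]
    for rate, ref in zip(gyro_rates, reference_headings):
        heading = alpha * (heading + rate * dt) + (1.0 - alpha) * ref
    return heading

# Robot turns at a constant 0.5 rad/s; the gyro has a 0.05 rad/s bias.
dt, steps, true_rate, bias = 0.01, 1000, 0.5, 0.05
gyro = [true_rate + bias] * steps
truth = [true_rate * dt * (i + 1) for i in range(steps)]

fused = complementary_filter(gyro, truth, dt)
drift_only = sum(g * dt for g in gyro)  # pure integration, no correction

# Pure integration accumulates the bias; the fused estimate stays bounded.
print(round(drift_only - truth[-1], 3), round(fused - truth[-1], 3))
```

Production systems use full SLAM back-ends and Kalman-style estimators rather than this scalar filter, but the division of labor is the same: the IMU carries the estimate between absolute fixes, and the absolute sensor removes the drift.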
While traditional collision avoidance treats all obstacles equally, modern systems use edge AI to differentiate between moving people, pets and furnishings. A cleaning robot, for example, pauses when it detects an obstacle and resumes its route as soon as the path is clear. This minimizes interruptions in dynamic environments.
In restaurants or care facilities, robots must be able to grasp and hand over objects. Grasp planners combine 3D point clouds with learned object models to determine stable grasp points. In a simulated kitchen task, a humanoid robot achieved a success rate of 92 percent by combining visual recognition with force-sensor feedback.
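The two-stage idea, vision proposes and force feedback confirms, can be sketched without any perception stack. Here the vision scores per candidate grasp point are simply given, and the force check compares the measured gripper force against the expected value; every number is invented for illustration.

```python
# Sketch of vision-ranked grasp candidates plus a force-feedback confirmation.

def rank_candidates(candidates):
    """candidates: list of (name, vision_score). Returns best-first order."""
    return sorted(candidates, key=lambda c: c[1], reverse=True)

def grasp_is_stable(measured_force_n, expected_force_n, tolerance=0.25):
    """Force sensors confirm contact: reading within tolerance of expected."""
    return abs(measured_force_n - expected_force_n) / expected_force_n <= tolerance

candidates = [("rim", 0.61), ("handle", 0.93), ("base", 0.47)]
best = rank_candidates(candidates)[0]
print(best[0], grasp_is_stable(measured_force_n=4.6, expected_force_n=5.0))
# → handle True
```

A real planner derives the scores from point-cloud geometry and learned models, but the control flow, rank visually, execute, verify by touch, is what the combination of visual recognition and force feedback in the text refers to.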
Furthermore, mixed human-robot environments require safety standards and redundant sensor technology. Zonal LiDAR systems create safety zones around moving machine parts and stop movements as soon as people get too close. This is particularly useful in the healthcare and hospitality sectors.
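The zone logic itself is simple: concentric distance bands around a moving part map to speed commands. The radii below are illustrative; real installations derive them from standardized speed-and-separation monitoring calculations.

```python
# Toy zonal safety logic: distance to the nearest person selects a speed mode.

def speed_command(distance_m: float) -> str:
    if distance_m < 0.5:
        return "stop"   # protective stop zone
    if distance_m < 1.5:
        return "slow"   # reduced-speed warning zone
    return "full"       # free operation

for d in (2.0, 1.0, 0.3):
    print(d, speed_command(d))
# → 2.0 full / 1.0 slow / 0.3 stop
```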
Date: 08.12.2025
Modularity and Flexibility for Fast Implementation
Humanoid robots must fit seamlessly into existing infrastructures across all deployment scenarios and support rapid adaptation to changing tasks. In the past, a central computing system handled all SLAM, image-processing and control tasks. Today, computing power is increasingly distributed to the periphery:
NPUs (Neural Processing Units) near the cameras for object recognition,
Microcontrollers for real-time processing of IMU data,
Multicore processors for higher-level motion planning.
This distributed approach significantly reduces both overall energy consumption and material costs while ensuring optimum performance. The open framework ROS 2 (Robot Operating System 2) facilitates integration by providing a hardware-independent platform for communication, life cycle management and real-time control. Support for DDS (Data Distribution Service) enables fast, reliable communication between sensor, planning and action modules.
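The decoupling that DDS-backed topics give ROS 2 modules can be illustrated without the framework itself. The toy in-process bus below deliberately does not use the real rclpy API; it only shows the pattern: publishers and subscribers know a topic name, never each other.

```python
# Toy publish/subscribe bus illustrating the decoupled topic pattern that
# ROS 2 (via DDS) provides between sensor, planning and action modules.

from collections import defaultdict

class Bus:
    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic: str, callback) -> None:
        self._subscribers[topic].append(callback)

    def publish(self, topic: str, message) -> None:
        for callback in self._subscribers[topic]:
            callback(message)

bus = Bus()
received = []

# The planning module listens for scans; it never imports the sensor module.
bus.subscribe("/scan", lambda msg: received.append(msg["ranges"][0]))

# The sensor module publishes without knowing who is listening.
bus.publish("/scan", {"ranges": [1.2, 1.3, 1.1]})
print(received)  # [1.2]
```

Because modules only share topic names and message shapes, a sensor, planner or actuator implementation can be swapped without touching the others, which is precisely what shortens integration on heterogeneous hardware.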
Whether via WLAN in hospitals or LTE and 5G connections outdoors, robots require flexible network architectures. Matter-based protocols could soon standardize connectivity in the smart home, while edge-to-cloud data pipelines feed industrial analysis platforms for fleet management and predictive maintenance.
In addition, battery management systems optimize runtime through dynamic power adjustment and condition diagnostics, maximizing uptime between charge cycles. On the drive side, crossover microcontrollers take over closed-loop control of the motors and support fieldbus interfaces for each axis or link. This allows robot platforms to be flexibly scaled from four to thirty drives without having to redevelop the central electronics.
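Dynamic power adjustment in its simplest form is a policy mapping the state of charge to a power mode, so the platform degrades gracefully instead of shutting down mid-task. The thresholds and mode names below are invented for illustration.

```python
# Sketch of state-of-charge-based power management. Thresholds illustrative.

def power_mode(state_of_charge: float) -> str:
    """state_of_charge in [0.0, 1.0]."""
    if state_of_charge < 0.10:
        return "return-to-dock"   # abort the task, keep enough charge to dock
    if state_of_charge < 0.30:
        return "eco"              # cap joint torques and CPU clocks
    return "performance"

for soc in (0.85, 0.25, 0.05):
    print(soc, power_mode(soc))
# → 0.85 performance / 0.25 eco / 0.05 return-to-dock
```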
On the Way to Everyday Life
Walking and humanoid robots are no longer science fiction. Initial practical applications in hospitals are showing promising results: they deliver medication, guide visitors through the building or disinfect rooms. In restaurants, robots take orders or clear away dishes, while household robots move independently through cluttered living rooms to perform everyday tasks.
The close integration of control, perception and modular architecture is crucial for widespread use. Three key findings characterize the current state of development:
Legged and multi-legged platforms are superior wherever wheels slip, sink or get stuck: on grass, uneven ground or in narrow corridors.
Multimodal sensor fusion of LiDAR, vision and inertial sensors is essential to navigate safely and effectively in human-centered spaces.
Modular computing architectures and standardized middleware shorten development cycles and promote innovation.
The combination of advanced motion control, improved perception and flexible hardware architecture is paving the way for a new generation of service robots that move effortlessly among humans and adapt to the unpredictability of everyday life. What once sounded like a dream of the future is now becoming reality, step by step.