Universal Robots and Scale AI launch the industry's first direct lab-to-factory solution for AI model training of robots at NVIDIA's GTC 2026.
The new UR AI Trainer, developed by Universal Robots and Scale AI, is the first direct lab-to-factory solution for AI model training.
(Source: Universal Robots)
Universal Robots (UR) unveiled the UR AI Trainer at GTC 2026 in Silicon Valley. Developed in collaboration with Scale AI, the AI Trainer marks a tectonic shift as robots move from pre-programmed applications to fully AI-driven tasks. These systems are powered by robust data generated in AI training cells where robots imitate humans.
“Our customers, ranging from large enterprises to AI research labs, are no longer just asking for AI features,” said Anders Beck, VP of AI Robotics Products at Universal Robots. “They need a way to collect high-fidelity, synchronized robot and vision data to train AI models on the same robots they intend to deploy. Our AI Trainer is the industry's first direct lab-to-factory solution for AI model training.”
Alongside the new AI Trainer, Universal Robots’ GTC booth will showcase a state-of-the-art robotic foundation model from Generalist AI, a UR preferred model partner. Leveraging this model, two UR robots will complete a complex smartphone packaging task, previously impossible without recent advances in the field of Physical AI.
Enabling AI-Ready Data Capture With Force Feedback and Direct Torque Control
AI robotics training is often hindered by fragmented hardware and low-fidelity data capture. Much of today’s training data is collected on research robots not suited for production environments, and many systems rely only on visual feedback, making delicate or contact-rich tasks difficult. “The AI Trainer directly addresses these barriers,” said Beck. “By utilizing our unique Direct Torque Control and force feedback features, we give developers direct influence over how the robot physically interacts with the world, training on the same robust hardware used in over 100,000 industrial deployments.”
Scale AI Partnership Enables a Flywheel of Integrated Robotics Data
The AI Trainer allows human operators to guide UR robots through tasks in a leader-follower setup while automatically capturing high-quality multimodal data for robotics AI development. Operators physically guide a “leader” robot through a task while a synchronized “follower” robot mirrors the motion in real time. During each demonstration, the system records synchronized motion, force, and visual data, producing the structured datasets required to train Vision-Language-Action (VLA) models.
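To make the capture pattern concrete, here is a minimal, purely illustrative sketch of a leader-follower demonstration recorder. All class and field names are hypothetical; the actual UR/Scale AI interfaces are not public, and a real system would stream joint states, wrench readings, and camera frames from hardware rather than fabricate them.

```python
from dataclasses import dataclass

# Hypothetical sketch of leader-follower data capture.
# None of these names come from the UR or Scale AI stack.

@dataclass
class Sample:
    t: float            # timestamp (s)
    leader_q: list      # leader joint positions (rad)
    follower_q: list    # follower joint positions (rad)
    wrench: list        # 6-axis force/torque at the follower wrist
    frame_id: int       # index of the camera frame captured at time t

class DemonstrationRecorder:
    """Records one synchronized motion/force/vision sample per control tick."""

    def __init__(self):
        self.samples = []

    def tick(self, t, leader_q, wrench, frame_id):
        # The follower mirrors the leader's joints each tick (1:1 mapping).
        follower_q = list(leader_q)
        self.samples.append(
            Sample(t, list(leader_q), follower_q, list(wrench), frame_id)
        )
        return follower_q

# Replay a tiny 3-tick demonstration with fabricated joint values.
rec = DemonstrationRecorder()
for i in range(3):
    q = [0.1 * i] * 6  # 6-DoF arm, made-up trajectory
    rec.tick(t=i * 0.002, leader_q=q, wrench=[0.0] * 6, frame_id=i)

print(len(rec.samples))            # number of synchronized samples
print(rec.samples[-1].follower_q)  # follower mirrors the leader each tick
```

The key property for VLA training is that every row carries motion, force, and a vision frame stamped at the same instant, so downstream models can learn contact-rich behavior rather than vision alone.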
Deployed on UR’s AI Accelerator platform, the UR AI Trainer combines UR robots with Scale AI software to enable data capture on UR robots in production and at scale, creating a continuous feedback loop that drives ongoing optimization of physical AI systems.
"Universal Robots is a leader in industrial robotics, and its global footprint offers the ideal foundation for data capture and AI deployment,” said Ben Levin, General Manager, Physical AI at Scale AI. “Together, we’ve created an integrated robotics data flywheel, allowing customers to train, deploy, and improve their AI models faster than ever before.”
As part of this collaboration, UR and Scale AI will release a large-scale industrial dataset collected on UR robots later this year.
Firsthand Encounters With AI Trainer at GTC
With GTC as the official launch pad, attendees will be able to experience the system first-hand at UR’s booth by guiding two UR3e ‘leader’ robots that provide haptic input to control two UR7e ‘follower’ robots. The setup lets visitors perform an advanced smartphone packaging task with haptic feedback for imitation learning and VLA training, with demonstration data recorded in real time on Scale’s stack and replayable directly on the AI Trainer.
The process of capturing robot training data for AI models is further showcased through a demo illustrating the same smartphone packaging task, trained virtually: built in NVIDIA Omniverse and leveraging Isaac Sim, the simulated setup allows attendees to control a virtual bi-manual UR3e system with real-time haptic feedback, using two Haply Inverse3 devices as ‘leaders’ in a physics-accurate simulation.
Universal Robots is also exploring the use of the NVIDIA Physical AI Data Factory Blueprint to automate and scale its synthetic data generation, transforming world-scale compute into a production engine for high-quality robotic training data.
"The shift toward Physical AI requires a fundamental move from rigid, pre-programmed automation to generalist robots that can perceive, reason, and learn through human-like interaction," said Amit Goel, head of robotics and edge AI ecosystem at NVIDIA. "By leveraging the NVIDIA Isaac simulation frameworks, Universal Robots is building a scalable engine for high-fidelity data capture and generation, providing the essential infrastructure to train the next generation of autonomous systems at scale."
Date: 08.12.2025
Generalist AI Demonstrates Real-World Robotic Foundation Model Performance
Complementing the two data-capture demonstrations, Generalist’s showcase highlights how advances in data collection and AI models translate into real-world robotic performance. In the first public demonstration of Generalist’s embodied foundation models, two UR7e robots autonomously execute a complex smartphone packaging task, demonstrating dexterity, coordination, and contact-rich manipulation in a real-world environment. The demonstration shows how scaled, high-quality training data combined with frontier model architectures can enable robust physical AI systems beyond the lab.
“Generalist is building embodied foundation models that deliver industry-leading dexterity and reliability,” said Pete Florence, co-founder and CEO of Generalist AI. “This demonstration on Universal Robots’ trusted industrial platform shows how physical commonsense can be translated into real-world capability, paving the way for deployment across industries at scale.”
“The adoption of our technology by the pioneers of AI model training and data capture underscores why Universal Robots has become the preferred platform for physical AI,” said UR’s Anders Beck.