Universal Robots and Scale AI Launch Imitation Learning System to Accelerate AI Model Training, Bridging the ‘Lab-to-Factory’ Gap
Key Terms
direct torque control
foundation model
haptic feedback
NVIDIA Omniverse
Physical AI

The new UR AI Trainer, developed by Universal Robots and Scale AI, is the first direct lab-to-factory solution for AI model training.
“Our customers, ranging from large enterprises to AI research labs, are no longer just asking for AI features,” said Anders Beck, VP of AI Robotics Products at Universal Robots. “They need a way to collect high-fidelity, synchronized robot and vision data to train AI models on the same robots they intend to deploy. Our AI Trainer is the industry's first direct lab-to-factory solution for AI model training.”
Alongside the new AI Trainer, Universal Robots’ GTC booth will showcase a state-of-the-art robotic foundation model from Generalist AI, a UR preferred model partner. Leveraging this model, two UR robots will complete a complex smartphone packaging task, previously impossible without recent advances in the field of Physical AI.
Enabling AI-ready data capture with force feedback and direct torque control
AI robotics training is often hindered by fragmented hardware and low-fidelity data capture. Much of today’s training data is collected on research robots not suited for production environments, and many systems rely only on visual feedback, making delicate or contact-rich tasks difficult. “The AI Trainer directly addresses these barriers,” said Beck. “By utilizing our unique Direct Torque Control and force feedback features, we give developers direct influence over how the robot physically interacts with the world, training on the same robust hardware used in over 100,000 industrial deployments.”
Scale AI partnership enables a flywheel of integrated robotics data
The AI Trainer allows human operators to guide UR robots through tasks in a leader-follower setup while automatically capturing high-quality multimodal data for robotics AI development. Operators physically guide a “leader” robot through a task while a synchronized “follower” robot mirrors the motion in real time. During each demonstration, the system records synchronized motion, force, and visual data, producing the structured datasets required to train Vision-Language-Action (VLA) models.
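To make the data-capture concept concrete, the loop above can be sketched as a fixed-rate poll of both arms that emits one synchronized record per tick. Everything in this sketch (the field names, the `capture_demonstration` helper, the 125 Hz rate, the stand-in sensor reads) is a hypothetical illustration, not UR’s or Scale AI’s actual API:

```python
from dataclasses import dataclass, asdict
import json

# Hypothetical shape of one synchronized frame in a demonstration episode.
@dataclass
class DemoFrame:
    t: float                 # capture timestamp (s)
    leader_q: list[float]    # leader joint positions (rad), 6-DoF arm
    follower_q: list[float]  # follower joint positions mirroring the leader
    tau: list[float]         # measured joint torques (Nm) for contact-rich steps
    wrench: list[float]      # tool-flange force/torque [Fx, Fy, Fz, Tx, Ty, Tz]
    image_id: str            # key into the synchronized camera stream

def capture_demonstration(read_leader, read_follower, n_frames, hz=125):
    """Poll both arms at a fixed rate and return a structured episode."""
    episode = []
    for i in range(n_frames):
        q_l, tau, wrench = read_leader()   # leader state + force sensing
        q_f = read_follower()              # follower mirrors in real time
        episode.append(DemoFrame(
            t=i / hz, leader_q=q_l, follower_q=q_f,
            tau=tau, wrench=wrench, image_id=f"cam0/{i:06d}",
        ))
    return episode

# Stand-in sensor reads so the sketch runs without hardware.
fake_leader = lambda: ([0.0] * 6, [0.0] * 6, [0.0] * 6)
fake_follower = lambda: [0.0] * 6

episode = capture_demonstration(fake_leader, fake_follower, n_frames=3)
print(json.dumps([asdict(f) for f in episode], indent=2)[:80])
```

The point of the structure is that every modality shares one timestamp, which is what lets a VLA model learn from force and vision jointly rather than from vision alone.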
Deployed on UR’s AI Accelerator platform, the UR AI Trainer combines UR robots with Scale AI software to enable data capture on UR robots in production and at scale, creating a continuous feedback loop that drives ongoing optimization of physical AI systems.
"Universal Robots is a leader in industrial robotics, and its global footprint offers the ideal foundation for data capture and AI deployment,” said Ben Levin, General Manager, Physical AI at Scale AI. “Together, we’ve created an integrated robotics data flywheel, allowing customers to train, deploy, and improve their AI models faster than ever before.” As part of this collaboration, UR and Scale AI will release a large-scale industrial dataset collected on UR robots later this year.
First-hand encounters with AI Trainer at GTC
With GTC as the official launch pad, attendees can experience the system first-hand at UR’s booth, guiding two UR3e ‘leader’ robots that provide haptic input to control two UR7e ‘follower’ robots. The setup lets visitors perform an advanced smartphone packaging task with haptic feedback for imitation learning and VLA training, with demonstration data recorded in real time on Scale’s stack and replayable directly on the AI Trainer.
The process of capturing robot training data for AI models is further showcased in a demo that illustrates the same smartphone packaging task, trained virtually instead: built in NVIDIA Omniverse and leveraging Isaac Sim, the simulated setup lets attendees control a virtual bi-manual UR3e system with real-time haptic feedback, using two Haply Inverse3 devices as ‘leaders’ in a physics-accurate simulation.
Universal Robots is also exploring the use of the NVIDIA Physical AI Data Factory Blueprint to automate and scale its synthetic data generation, transforming world-scale compute into a production engine for high-quality robotic training data.
"The shift toward Physical AI requires a fundamental move from rigid, pre-programmed automation to generalist robots that can perceive, reason, and learn through human-like interaction," said Amit Goel, head of robotics and edge AI ecosystem at NVIDIA. "By leveraging the NVIDIA Isaac simulation frameworks, Universal Robots is building a scalable engine for high-fidelity data capture and generation, providing the essential infrastructure to train the next generation of autonomous systems at scale."
Generalist AI demonstrates real-world robotic foundation model performance
Complementing the two data-capture demonstrations, Generalist’s showcase highlights how advances in data collection and AI models translate into real-world robotic performance. In the first public demonstration of Generalist’s embodied foundation models, two UR7e robots autonomously execute a complex smartphone packaging task, demonstrating dexterity, coordination, and contact-rich manipulation in a real-world environment. The demonstration shows how scaled, high-quality training data combined with frontier model architectures can enable robust physical AI systems beyond the lab.
“Generalist is building embodied foundation models that deliver industry-leading dexterity and reliability,” said Pete Florence, co-founder and CEO of Generalist AI. “This demonstration on Universal Robots’ trusted industrial platform shows how physical commonsense can be translated into real-world capability, paving the way for deployment across industries at scale.”
“The adoption of our technology by the pioneers of AI model training and data capture underscores why Universal Robots has become the preferred platform for physical AI,” said UR’s Anders Beck, who will share his expertise on a featured panel at GTC titled “Beyond the Workcell: Scaling Robotics Workflows Across the Factory Floor” (Wednesday, March 18, 11:00 a.m.).
About Universal Robots
Universal Robots is a global leader in collaborative robotics (cobots), used across a wide range of industries. Our mission is simple: Automation for anyone. Anywhere. With over 100,000 cobots sold worldwide, our user-friendly platform is supported by intuitive PolyScope software, award-winning training, comprehensive services, and the world’s largest cobot ecosystem, delivering innovation and choice to our customers. Universal Robots is part of Teradyne Robotics, a division of Teradyne (NASDAQ:TER), a leading supplier of automatic test equipment and advanced robotics technology.
www.universal-robots.com
About Scale AI
Scale’s mission is to develop reliable AI systems for the world’s most important decisions. We provide the high-quality data that powers the world’s AI models, and we help enterprises and governments build, deploy, and oversee AI applications that create real impact. Through our research and Safety, Evaluations, and Alignment Lab (SEAL), we test models with rigorous benchmarks and novel research to help ensure AI is developed in ways people can trust. Founded in 2016, Scale is headquartered in San Francisco.
www.scale.com
View source version on businesswire.com: https://www.businesswire.com/news/home/20260316660072/en/
Media Contact
Mette McCall, McCall Media
+1 251 278 9847
mette@mccallmedia.net
Source: Universal Robots