Humanoid and cobot developers can use NVIDIA’s full-stack safety platform. Source: NVIDIA
At its GPU Technology Conference, or GTC, in Paris today, NVIDIA Corp. announced new tools for applying artificial intelligence to the development of robots, self-driving cars, and smart cities. The company claimed that European manufacturers are racing to reinvent their processes to become more software-defined and AI-driven because of labor shortages and an emphasis on sustainability.

“Now is the era for physical AI,” said Rev Lebaredian, vice president for Omniverse and simulation technology at NVIDIA. “In the next five years, we’ll have a labor gap of 50 million people. With onshoring and reshoring of manufacturing, more factories and warehouses will have to be built faster than ever.”
NVIDIA Isaac GR00T N1.5, an open foundation model for humanoid robot reasoning and skills, is now available for download on Hugging Face. This update enhances the model’s adaptability and ability to follow instructions, improving its performance in material handling and manufacturing tasks.
The NVIDIA Isaac Sim 5.0 and Isaac Lab 2.2 open-source robotics simulation and learning frameworks, optimized for NVIDIA RTX PRO 6000 systems, are available on GitHub for developer preview.
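For developers who want to try the new model, the GR00T N1.5 checkpoint can be pulled with standard Hugging Face tooling. The snippet below is a minimal sketch; the repository ID nvidia/GR00T-N1.5-3B is an assumption, so confirm the exact name on NVIDIA’s Hugging Face page.

```python
# Minimal sketch: download the Isaac GR00T N1.5 weights from Hugging Face.
# NOTE: the repo_id below is an assumption -- verify the exact repository
# name under the NVIDIA organization on huggingface.co before running.
from huggingface_hub import snapshot_download

local_dir = snapshot_download(
    repo_id="nvidia/GR00T-N1.5-3B",  # assumed repository ID
    local_dir="./gr00t-n1.5",        # destination folder for the checkpoint
)
print(f"Model files downloaded to {local_dir}")
```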
European ecosystem builds on NVIDIA’s three computers
NVIDIA said Europe’s leading robotics developers and providers are integrating its Isaac robotics platform to train, simulate, and deploy robots across different embodiments. It said they are following its three-computer approach, which includes:
NVIDIA DGX systems and GPUs for training AI models and developing AV software
The NVIDIA Omniverse and Cosmos platforms running on OVX systems for simulation and synthetic data generation, enabling the testing and validation of autonomous driving scenarios and optimization of smart factory operations
The automotive-grade NVIDIA DRIVE AGX in-vehicle computer for processing real-time sensor data for safe autonomous driving
Lebaredian said that all of NVIDIA’s computers can run all of its software, but the architectures are sized and tuned differently for optimal performance across form factors, from industrial robots and self-driving cars to humanoids.
Numerous robotics companies are using NVIDIA’s hardware and software to make their systems smarter and safer, and several are exhibiting in Paris this week. For instance, Agile Robots is post-training the GR00T N1 model in Isaac Lab to train its dual-arm manipulator robots, which run on NVIDIA Jetson hardware to execute a variety of tasks in industrial environments.
Meanwhile, idealworks has adopted the Mega NVIDIA Omniverse Blueprint for robotic fleet simulation to extend the blueprint’s capabilities to humanoids. Building on the VDA 5050 framework, idealworks is helping to develop guidance that supports tasks such as picking, moving, and placing objects.
Neura Robotics is integrating the NVIDIA Isaac platform to enhance its robot development workflows. The company is using GR00T-Mimic to post-train the Isaac GR00T N1 robot foundation model for its MiPA service robot.
Neura is also collaborating with SAP and NVIDIA to integrate SAP’s Joule agents with its robots, using the Mega NVIDIA Omniverse blueprint to simulate and refine robot behavior in complex, realistic operational scenarios before deployment.
Vorwerk is post-training GR00T N1 models in Isaac Lab with its custom synthetic data pipeline, which is built on Isaac GR00T-Mimic and powered by the NVIDIA Omniverse platform. The enhanced models are then deployed on the NVIDIA Jetson AGX, Jetson Orin, or Jetson Thor modules for advanced, real-time home robotics.
Humanoid is using NVIDIA’s full robotics stack, including Isaac Sim and Isaac Lab, to cut its prototyping time down by six weeks. The company is training its vision language action (VLA) models on NVIDIA DGX B200 systems to boost the cognitive abilities of its robots, allowing them to operate autonomously in complex environments using Jetson Thor onboard computing.
Universal Robots has introduced UR15, its fastest collaborative robot yet, to the European market. Using UR’s AI Accelerator — developed on the NVIDIA Isaac platform’s CUDA-accelerated libraries and AI models, as well as Jetson AGX Orin — manufacturers can build AI applications to embed intelligence into the new cobots.
Wandelbots’ NOVA Operating System is now integrated with Omniverse to simulate, validate, and optimize robotic behaviors virtually before they are deployed to physical robots. Wandelbots also collaborated with EY and EDAG to offer manufacturers a scalable automation platform on Omniverse that speeds up the transition from proof of concept to full-scale deployment.
Extend Robotics is using the Isaac GR00T platform to enable customers to control and train robots for industrial tasks like visual inspection and handling radioactive materials. The company’s Advanced Mechanics Assistance System lets users collect demonstration data and generate diverse synthetic datasets with NVIDIA GR00T-Mimic and GR00T-Gen to train the GR00T N1 foundation model.
SICK is integrating new certified sensor models — including 2D and 3D lidars, safety scanners, and cameras — into NVIDIA Isaac Sim. This enables engineers to virtually design, test, and validate machines using SICK’s perception models within Omniverse, supporting processes from product development to large-scale robotic fleet management.
Toyota Material Handling is working with SoftServe to simulate its autonomous mobile robots (AMRs) working alongside human workers, using the Mega NVIDIA Omniverse Blueprint. Toyota is testing and simulating traffic scenarios to refine its AI algorithms before real-world deployment.
Halos provides safety framework for AVs, robots
NVIDIA announced that NVIDIA Halos — a comprehensive safety system that unifies hardware architecture, AI models, software, tools and services — has expanded from autonomous vehicles (AVs) to the entire development lifecycle of AI-driven robots. Halos also addresses technologies such as algorithms, time for deployment, and computation for simulation and training, providing “guardrails” at different development stages.
The NVIDIA Halos AI Systems Inspection Lab has earned accreditation from the ANSI National Accreditation Board (ANAB) to perform inspections across functional safety.
“NVIDIA’s latest evaluation with ANAB verifies the demonstration of competence and compliance with internationally recognized standards, helping ensure that developers of autonomous machines — from automotive to robotics — can meet the highest benchmarks for functional safety,” stated R. Douglas Leonard Jr., executive director of ANAB.
To support robotics leaders in strengthening safety across the entire robotics development lifecycle, Halos provides:
Safety extension packages for the NVIDIA IGX platform, enabling manufacturers to easily program safety functions into their robots, supported by TUV Rheinland’s inspection of NVIDIA IGX
Robotic safety platform, which includes IGX and the NVIDIA Holoscan Sensor Bridge for a unified approach to designing sensor-to-compute architecture with built-in AI safety
Outside-in safety AI inspector, an AI-powered agent for monitoring robot operations, helping improve worker safety
Arcbest, Advantech, Bluewhite, Boston Dynamics, FORT, Inxpect, KION, NexCobot — a NEXCOM company, SICK, and Synapticon were among the first robotics companies to join the Halos Inspection Lab and integrate their products with NVIDIA’s safety and cybersecurity requirements.
“Autonomy is a journey,” said Lebaredian. “Our customers start with building a virtual facility in Omniverse to use for planning and processing optimization. Then, they can use the same digital twin to train, test, and simulate robots.”
“Some of our first partners are in the next stage, with BMW simulating fleets of mixed robots and Mercedes-Benz using their virtual factory to test humanoid deployment,” he explained. “New customers like Toyota and Schaeffler are just beginning their Omniverse journeys.”
Siemens partners for AI in manufacturing
Siemens and NVIDIA have expanded their partnership to further integrate Siemens’ Xcelerator marketplace with NVIDIA’s Omniverse, as well as with generative AI and robotics. The companies said this will enable visualization and more realistic digital twins for more informed decisions.
“This is a really exciting partnership, bringing together NVIDIA’s full-stack AI computing infrastructure and Siemens’ massive reach into the $16 trillion global manufacturing market,” noted Lebaredian during a press briefing.
Siemens’ Industrial Copilot for Operations brings generative AI to shop-floor operators and will be optimized to run on premises with NVIDIA RTX PRO 6000 Blackwell Server Edition GPUs. The Operations Copilot relies on NVIDIA NeMo microservices and the NVIDIA AI Blueprint for video search and summarization to deliver real-time, AI-powered assistance for shop-floor operations, which the companies said reduces reactive maintenance time by 30%.
In addition, a new line of Siemens Industrial PCs certified for NVIDIA GPUs can withstand heat, dust and vibration, allowing for 24/7 operation. They enable complex robotics tasks — from quality inspection to predictive maintenance — and can deliver a 25x acceleration in AI execution, said the partners.
NVIDIA improves simulation capabilities
DiffusionRenderer, another NVIDIA research project, uses an AI process called neural rendering to approximate how light behaves in the real world. The company said DiffusionRenderer provides a framework for video lighting control, editing, and synthetic data augmentation, making it useful for physical AI development and for creative industries such as video game design.
NVIDIA described DiffusionRenderer as “an AI light switch for videos that can turn daytime scenes into nightscapes, transform sunny afternoons to cloudy days, and tone down harsh fluorescent lighting into soft, natural illumination.”
“To help supercharge industrial AI prevalence in Europe, we’re building the world’s first industrial AI cloud in Germany in partnership with Deutsche Telekom,” Lebaredian added. “The build-out will start with 10,000 RTX Pro and B200 GPUs, making it Germany’s largest AI factory deployment to date and the first step in the German federal government’s project to build up the country’s sovereign AI infrastructure.”
The new industrial AI factory in Germany will feature NVIDIA DGX B200 and NVIDIA RTX PRO servers. It is intended to enable European industrial leaders to accelerate every manufacturing application, from engineering and simulation to factory digital twins and robotics. These systems will run NVIDIA CUDA-X libraries, RTX, and Omniverse-accelerated workloads from leading software providers such as Siemens, Ansys, Cadence, and Rescale.
The company said it is working with major European manufacturers on everything from product design and factory planning to AI-driven operations and logistics.
NVIDIA ready for autonomous vehicles, smart cities
NVIDIA said its DRIVE software for AVs is now in full production. Halos includes NVIDIA DriveOS, a certified framework intended to provide a reliable foundation for safe vehicle operation while meeting stringent automotive standards.
In addition, NVIDIA released NVIDIA Cosmos Predict-2, a new world foundation model with improved future world state prediction capabilities for high-quality synthetic data generation. The company said the new model can accelerate training of AVs with its contextual understanding of visual inputs and text, leading to fewer “hallucinations.”
The latest release of CARLA, a leading open-source AV simulator, has integrated the Cosmos Transfer application programming interfaces (APIs) and the NVIDIA NuRec tools for neural reconstruction and rendering. The company said CARLA’s user base of more than 150,000 AV developers can now render synthetic simulation scenes and viewpoints with high fidelity, as well as generate endless variations of lighting, weather, and terrain using simple prompts.
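CARLA itself is scriptable through a well-documented Python client, and the kind of lighting and weather variation described above can be driven programmatically. The sketch below uses only CARLA’s standard API against a simulator running on localhost; the Cosmos Transfer APIs and NuRec tools have their own interfaces and are not shown here.

```python
# Minimal sketch: sweep lighting and weather conditions in a running CARLA server.
# Requires the `carla` Python package and a CARLA simulator listening on localhost:2000.
# The Cosmos Transfer / NuRec integrations mentioned above are not part of this snippet.
import carla

client = carla.Client("localhost", 2000)
client.set_timeout(10.0)
world = client.get_world()

# Vary sun altitude and cloud cover to produce different lighting conditions.
for sun_altitude in (90.0, 30.0, -10.0):  # roughly noon, late afternoon, night
    weather = carla.WeatherParameters(
        cloudiness=40.0,
        precipitation=0.0,
        sun_altitude_angle=sun_altitude,
    )
    world.set_weather(weather)
    print(f"Applied weather variant with sun altitude {sun_altitude} degrees")
```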
“The big difference between the stack for autonomous driving and robotics is the level of maturity,” said Lebaredian. “Autonomous driving has been in production at scale for a longer time than general robotics, but it’s moving pretty quickly. Everything we talked about with Halos in terms of redundancy and building in safety at every level of the stack, it’s very similar.”
GTRS model architecture showing a unified system for generating and scoring diverse driving trajectories using diffusion- and vocabulary-based trajectories. Source: NVIDIA
The company also touted its win of the Autonomous Grand Challenge at the Computer Vision and Pattern Recognition (CVPR) conference in Nashville this week.
Not only did NVIDIA present more than 60 papers, but its AV Applied Research Team also presented the Generalized Trajectory Scoring (GTRS) method. GTRS generates a variety of candidate trajectories and progressively filters them down to the best one. It could help generate safer and more adaptive driving trajectories, said NVIDIA.
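As a rough illustration of the generate-and-score idea (not NVIDIA’s GTRS implementation), a planner can propose many candidate trajectories, score them with a critic, and progressively narrow the set until one remains:

```python
# Illustrative sketch of a generate-and-score trajectory selector.
# This is not NVIDIA's GTRS code; it only mirrors the high-level idea of
# proposing many candidate trajectories and progressively filtering to the best.
import numpy as np

def propose_trajectories(n: int, horizon: int = 20) -> np.ndarray:
    """Stand-in generator: random smooth (x, y) rollouts of shape (n, horizon, 2)."""
    steps = np.random.normal(scale=0.5, size=(n, horizon, 2))
    return np.cumsum(steps, axis=1)

def score(trajectories: np.ndarray) -> np.ndarray:
    """Stand-in critic: prefer shorter, smoother paths (higher score is better)."""
    lengths = np.linalg.norm(np.diff(trajectories, axis=1), axis=-1).sum(axis=1)
    jerk = np.abs(np.diff(trajectories, n=2, axis=1)).sum(axis=(1, 2))
    return -(lengths + 0.5 * jerk)

candidates = propose_trajectories(n=256)
for keep in (64, 16, 1):  # progressively narrow the candidate set
    top = np.argsort(score(candidates))[-keep:]
    candidates = candidates[top]

best_trajectory = candidates[0]  # shape (horizon, 2): the surviving candidate
print("Selected trajectory with", best_trajectory.shape[0], "waypoints")
```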
The NVIDIA Omniverse Blueprint for smart city AI is a reference framework that combines the company’s simulation and AI platforms. With it, developers can build simulation-ready, or SimReady, photorealistic digital twins of entire cities, then build and test AI agents that help monitor and optimize municipal operations.
“Europe led the first two industrial revolutions,” Lebaredian observed. “And now, in the era of AI, a new industry is blooming: robotics. With such a rich history in mechatronics and industrial craft, European robot makers are emerging as a global force in autonomy.”