Robotics companies are incorporating NVIDIA Isaac accelerated libraries and AI models into their platforms.
NVIDIA and its robotics ecosystem partners announced generative AI tools, simulation capabilities, and perception workflows for Robot Operating System (ROS) developers at ROSCon in Odense, one of Denmark’s oldest cities and a center of automation.
The announcements include new workflows and generative AI nodes for ROS developers deploying to the NVIDIA Jetson platform for edge AI and robotics. With generative AI, robots can perceive and understand their environment, interact naturally with people, and make adaptive decisions on their own.
Generative AI Comes to the ROS Community
ReMEmbR, built on ROS 2, uses generative AI to improve how robots reason and act. It combines large language models (LLMs), vision language models (VLMs), and retrieval-augmented generation (RAG) to build and query long-term semantic memories, improving robot navigation and interaction with the surrounding environment.
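To make the retrieval-augmented memory idea concrete, here is a minimal, self-contained sketch: timestamped VLM-style captions are stored and later retrieved by similarity to a question. A toy bag-of-words retriever stands in for the real embedding model, VLM, and LLM, and all class and field names are illustrative assumptions rather than the ReMEmbR API.

```python
# Minimal sketch of a ReMEmbR-style long-term semantic memory.
# A toy bag-of-words retriever stands in for the embedding model, VLM, and LLM.
import math
from collections import Counter
from dataclasses import dataclass


@dataclass
class MemoryEntry:
    timestamp: float   # when the observation was made
    position: tuple    # (x, y) robot pose at that time
    caption: str       # VLM-style description of what was seen


def _vec(text: str) -> Counter:
    return Counter(text.lower().split())


def _cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0


class SemanticMemory:
    """Append captions over time, then retrieve the most relevant ones."""

    def __init__(self):
        self.entries: list[MemoryEntry] = []

    def add(self, entry: MemoryEntry) -> None:
        self.entries.append(entry)

    def query(self, question: str, k: int = 3) -> list[MemoryEntry]:
        q = _vec(question)
        ranked = sorted(self.entries, key=lambda e: _cosine(q, _vec(e.caption)), reverse=True)
        return ranked[:k]


if __name__ == "__main__":
    memory = SemanticMemory()
    memory.add(MemoryEntry(10.0, (1.2, 0.5), "a red fire extinguisher mounted near the elevator"))
    memory.add(MemoryEntry(42.0, (8.4, 3.1), "stacked cardboard boxes blocking aisle three"))
    # In the full system, the retrieved entries would be handed to an LLM to
    # answer the question and propose a navigation goal.
    for hit in memory.query("where is the fire extinguisher?"):
        print(hit.position, hit.caption)
```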
Speech recognition is powered by the WhisperTRT ROS 2 node, which optimizes OpenAI’s Whisper model with NVIDIA TensorRT to deliver low-latency inference on NVIDIA Jetson and enable responsive human-robot interaction.
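The node-level structure can be sketched as a subscriber that receives raw audio chunks and publishes transcribed text. The topic names and the stubbed transcribe() backend below are assumptions for illustration, not the actual whisper_trt interface.

```python
# Hedged sketch of a WhisperTRT-style ROS 2 transcription node:
# subscribe to raw audio, publish transcribed text.
import rclpy
from rclpy.node import Node
from std_msgs.msg import String, UInt8MultiArray


class SpeechToTextNode(Node):
    def __init__(self):
        super().__init__("speech_to_text")
        # Hypothetical topics; a real deployment wires these to a microphone driver.
        self.sub = self.create_subscription(UInt8MultiArray, "audio/raw", self.on_audio, 10)
        self.pub = self.create_publisher(String, "speech/transcript", 10)

    def on_audio(self, msg: UInt8MultiArray) -> None:
        text = self.transcribe(bytes(msg.data))
        if text:
            self.pub.publish(String(data=text))
            self.get_logger().info(f"heard: {text}")

    def transcribe(self, audio_bytes: bytes) -> str:
        # Placeholder: in the real node this would run a TensorRT-optimized
        # Whisper engine on the Jetson GPU for low-latency inference.
        return ""


def main():
    rclpy.init()
    node = SpeechToTextNode()
    rclpy.spin(node)
    node.destroy_node()
    rclpy.shutdown()


if __name__ == "__main__":
    main()
```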
The ROS 2 robots with voice control project uses the NVIDIA Riva ASR-TTS service so robots can understand and respond to spoken commands. Separately, the NASA Jet Propulsion Laboratory demonstrated ROSA, an AI-powered agent for ROS, running on its Nebula-SPOT robot and on the NVIDIA Nova Carter robot in NVIDIA Isaac Sim.
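As a rough illustration of the voice-control pattern, the sketch below maps recognized phrases (for example, text published by an ASR node as std_msgs/String) to velocity commands. The transcript topic name, phrase list, and speed values are assumptions for this example.

```python
# Illustrative voice-teleop sketch: recognized phrases -> Twist commands.
import rclpy
from rclpy.node import Node
from std_msgs.msg import String
from geometry_msgs.msg import Twist

# Simple keyword-to-motion table (linear.x, angular.z); values are illustrative.
COMMANDS = {
    "forward": (0.2, 0.0),
    "stop": (0.0, 0.0),
    "turn left": (0.0, 0.5),
    "turn right": (0.0, -0.5),
}


class VoiceTeleop(Node):
    def __init__(self):
        super().__init__("voice_teleop")
        self.sub = self.create_subscription(String, "speech/transcript", self.on_text, 10)
        self.pub = self.create_publisher(Twist, "cmd_vel", 10)

    def on_text(self, msg: String) -> None:
        phrase = msg.data.lower().strip()
        for key, (lin, ang) in COMMANDS.items():
            if key in phrase:
                cmd = Twist()
                cmd.linear.x = lin
                cmd.angular.z = ang
                self.pub.publish(cmd)
                self.get_logger().info(f"'{phrase}' -> {key}")
                return


def main():
    rclpy.init()
    rclpy.spin(VoiceTeleop())
    rclpy.shutdown()


if __name__ == "__main__":
    main()
```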
At ROSCon, Canonical is demonstrating NanoOWL, a zero-shot object detection model, running on the NVIDIA Jetson Orin Nano system-on-module. It lets robots recognize a wide variety of objects in real time without relying on predefined categories.
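The zero-shot, prompt-driven idea can be shown with a short sketch. NanoOWL itself is a TensorRT-optimized OWL-ViT pipeline for Jetson; here the stock Hugging Face OWL-ViT checkpoint stands in, and the image path, prompts, and threshold are illustrative assumptions rather than the NanoOWL API.

```python
# Zero-shot detection sketch using OWL-ViT as a stand-in for NanoOWL.
import torch
from PIL import Image
from transformers import OwlViTProcessor, OwlViTForObjectDetection

processor = OwlViTProcessor.from_pretrained("google/owlvit-base-patch32")
model = OwlViTForObjectDetection.from_pretrained("google/owlvit-base-patch32")

image = Image.open("workspace.jpg")  # placeholder: any RGB frame from the robot camera
prompts = [["a cardboard box", "a forklift", "a person wearing a safety vest"]]

inputs = processor(text=prompts, images=image, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Convert logits to boxes in image coordinates and keep confident hits.
target_sizes = torch.tensor([image.size[::-1]])
results = processor.post_process_object_detection(
    outputs, threshold=0.3, target_sizes=target_sizes
)[0]
for score, label, box in zip(results["scores"], results["labels"], results["boxes"]):
    print(f"{prompts[0][int(label)]}: {score:.2f} at {box.tolist()}")
```

Because the categories are given as free-form text prompts, the detector can be retargeted to new objects at runtime without retraining.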
Developers can get started today with ROS 2 Nodes for Generative AI, which bring NVIDIA Jetson-optimized LLMs and VLMs into robot applications to enhance their capabilities.
Enhancing ROS Workflows With a ‘Sim-First’ Approach
AI-enabled robots must be safely tested and validated in simulation before deployment. With NVIDIA Isaac Sim, a robotics simulation platform built on OpenUSD, ROS developers can test robots in a virtual environment simply by connecting them to their ROS packages. A recently released Beginner’s Guide to ROS 2 Workflows With Isaac Sim demonstrates the end-to-end workflow for robot simulation and testing.
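A small sketch of the sim-first habit: before touching hardware, verify that the simulator’s ROS 2 bridge is publishing the topics your stack expects. The topic names below are typical examples chosen for illustration, not fixed Isaac Sim defaults.

```python
# Smoke test: confirm expected simulation topics are publishing data.
import rclpy
from rclpy.node import Node
from sensor_msgs.msg import Image, LaserScan

# Example topics only; adjust to your robot's ROS interface.
EXPECTED = {"/front_camera/rgb": Image, "/scan": LaserScan}


class SimSmokeTest(Node):
    def __init__(self):
        super().__init__("sim_smoke_test")
        self.seen = set()
        for topic, msg_type in EXPECTED.items():
            self.create_subscription(msg_type, topic, lambda msg, t=topic: self.mark(t), 10)
        self.timer = self.create_timer(10.0, self.report)  # report after 10 s

    def mark(self, topic: str) -> None:
        if topic not in self.seen:
            self.seen.add(topic)
            self.get_logger().info(f"receiving data on {topic}")

    def report(self) -> None:
        missing = set(EXPECTED) - self.seen
        if missing:
            self.get_logger().warning(f"no data yet on: {sorted(missing)}")
        else:
            self.get_logger().info("all expected simulation topics are live")
        self.timer.cancel()  # report once; stop the node manually when done


def main():
    rclpy.init()
    rclpy.spin(SimSmokeTest())
    rclpy.shutdown()


if __name__ == "__main__":
    main()
```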
Foxglove, a member of the NVIDIA Inception program for startups, showcased an integration that helps developers visualize and debug simulation data in real time through a Foxglove extension built on Isaac Sim.
New Capabilities for Isaac ROS 3.2
NVIDIA Isaac ROS is a collection of accelerated computing packages and AI models for robotics development that is based on the open-source ROS 2 software platform. The forthcoming 3.2 update improves environment mapping, robot perception, and manipulation.
Among the main enhancements to NVIDIA Isaac Manipulator are new standard workflows that combine FoundationPose and cuMotion to speed up the development of pick-and-place and object-following pipelines.
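To illustrate the hand-off in such a pipeline, the sketch below takes a 6-DoF object pose, as a FoundationPose-style detector might publish in a vision_msgs/Detection3DArray, and republishes it as a grasp target for a downstream motion planner such as cuMotion via MoveIt. The topic names and the fixed approach offset are assumptions, not the Isaac Manipulator interface.

```python
# Sketch: relay a detected object pose to a grasp-target topic for a planner.
import rclpy
from rclpy.node import Node
from vision_msgs.msg import Detection3DArray
from geometry_msgs.msg import PoseStamped


class GraspTargetRelay(Node):
    def __init__(self):
        super().__init__("grasp_target_relay")
        # Hypothetical topic names for this example.
        self.sub = self.create_subscription(Detection3DArray, "object_poses", self.on_detections, 10)
        self.pub = self.create_publisher(PoseStamped, "grasp_target", 10)

    def on_detections(self, msg: Detection3DArray) -> None:
        if not msg.detections or not msg.detections[0].results:
            return
        # Take the first hypothesis of the first detection.
        object_pose = msg.detections[0].results[0].pose.pose
        target = PoseStamped()
        target.header = msg.header
        target.pose = object_pose
        target.pose.position.z += 0.10  # approach 10 cm above the object (assumption)
        self.pub.publish(target)
        self.get_logger().info("published grasp target; the motion planner takes it from here")


def main():
    rclpy.init()
    rclpy.spin(GraspTargetRelay())
    rclpy.shutdown()


if __name__ == "__main__":
    main()
```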
Another is NVIDIA Isaac Perceptor, which improves the environmental awareness and performance of autonomous mobile robots (AMRs) in dynamic environments such as warehouses. It adds a new visual SLAM reference workflow, improved multi-camera detection, and 3D reconstruction.
Partners Adopting NVIDIA Isaac
Robotics companies across the ecosystem are integrating NVIDIA Isaac accelerated libraries and AI models into their platforms:
- Universal Robots, a Teradyne Robotics company, introduced a new AI Accelerator toolkit to support the development of AI-powered cobot applications.
- Miso Robotics is using Isaac ROS to accelerate its AI-driven Flippy Fry Station, a robotic french-fry-cooking system, and to improve the efficiency and precision of its food service automation.
- Wheel.me is working with RGo Robotics and NVIDIA to build a production-ready AMR using Isaac Perceptor.
- Isaac Perceptor is being used by Main Street Autonomy to expedite sensor calibration.
- Orbbec unveiled its Perceptor Developer Kit, an out-of-the-box AMR solution built on Isaac Perceptor.
- LIPS Corporation released a multi-camera perception devkit to improve AMR navigation.
- For ROS developers, Canonical highlighted a fully certified Ubuntu environment that provides long-term support right out of the box.
Connecting With Partners at ROSCon
Canonical, Ekumen, Foxglove, Intrinsic, Open Navigation, Siemens, and Teradyne Robotics are among the ROS community members and partners in Denmark offering workshops, presentations, booth demos, and sessions. Highlights include:
- “Nav2 User Gathering” meetup with Open Navigation LLC’s Steve Macenski
- “ROS in Large-Scale Factory Automation” with Carsten Braunroth from Siemens AG and Michael Gentner from BMW AG
- “Incorporating AI into Workflows for Robot Manipulation” Birds of a Feather meeting with NVIDIA’s Kalyan Vadrevu
- “Speeding Up Robot Learning in Simulation at Scale” Birds of a Feather session
- “On Use of Nav2 Docking” with Open Navigation’s Macenski and NVIDIA’s Markus Wuensch
- In addition, Teradyne Robotics and NVIDIA will jointly host a luncheon and evening reception on Tuesday, October 22, in Odense, Denmark.
ROSCon is organized by the Open Source Robotics Foundation (OSRF). NVIDIA supports Open Robotics, the umbrella organization that encompasses OSRF and all of its initiatives.