ROS2 Deployment#

COMPASS ships ROS2 nodes that consume a TensorRT engine and emit velocity commands. The same nodes cover three integration paths:

  • Isaac Sim — driving a simulated robot end-to-end via the ROS2 bridge. The detailed setup is on the Isaac Sim setup sub-page.

  • Sim2Real — deploying the same compass_inference node directly on a real robot, optionally pairing with visual SLAM (e.g. cuVSLAM) for state estimation.

  • Object navigation — wiring an object-localization module (e.g. Locate3D) into the bundled obj_target_generator node so the robot can approach named objects.

The packages live under ros2_deployment/.

Provided ROS2 nodes#

  • compass_inference — consumes camera images, target poses, and robot speed as inputs, and outputs velocity commands by running TensorRT inference with a COMPASS engine.

  • obj_target_generator — receives object localization bounding boxes and generates navigation target poses for the COMPASS navigator.

These nodes enable a variety of integration workflows; the three below are the bundled examples.
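To make the node interfaces above concrete, a hypothetical ROS2 parameter file for compass_inference might look like the sketch below. Every parameter and topic name here is an illustrative assumption, not the package's actual interface; check the files under ros2_deployment/ for the real names.

```yaml
# Hypothetical parameters for compass_inference -- names are
# illustrative, not the actual interface shipped in ros2_deployment/.
compass_inference:
  ros__parameters:
    engine_path: /path/to/compass.engine   # TensorRT engine built for this GPU
    image_topic: /camera/image_raw         # camera images (input)
    target_topic: /goal_pose               # target poses (input)
    odom_topic: /odom                      # robot speed (input)
    cmd_vel_topic: /cmd_vel                # velocity commands (output)
```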

1. Isaac Sim integration#

The compass_navigator package works with NVIDIA Isaac Sim out of the box, using the Isaac Sim ROS2 bridge to exchange sensor data and velocity commands with a simulated robot in Isaac Sim's robotics environments.

For detailed setup instructions and a step-by-step Isaac Sim integration guide, see the Isaac Sim setup sub-page.

2. Zero-shot sim2real transfer#

The compass_inference node can also be deployed directly on a real robot, so a policy trained in simulation runs in the real world without retraining. Paired with a visual SLAM solution such as cuVSLAM for robot state estimation, the COMPASS model supports zero-shot sim2real transfer.
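In a setup like this, the SLAM state estimate is typically used to express the navigation target in the robot's own body frame before it reaches the policy. Below is a minimal 2D sketch of that transform, assuming the SLAM-estimated robot pose and the goal are given in the same fixed frame; this helper is illustrative and not part of the COMPASS packages.

```python
import math

def target_in_robot_frame(robot_x, robot_y, robot_yaw, goal_x, goal_y):
    """Express a goal given in the map/odom frame in the robot's body frame.

    (robot_x, robot_y, robot_yaw) is the SLAM-estimated robot pose and
    (goal_x, goal_y) the target, both in the same fixed frame.
    """
    dx, dy = goal_x - robot_x, goal_y - robot_y
    # Rotate the displacement by -yaw to get body-frame coordinates:
    # x forward, y to the robot's left (the usual ROS convention).
    gx = math.cos(robot_yaw) * dx + math.sin(robot_yaw) * dy
    gy = -math.sin(robot_yaw) * dx + math.cos(robot_yaw) * dy
    return gx, gy

# Robot at (1, 1) facing +y; a goal at (1, 2) is 1 m straight ahead.
print(target_in_robot_frame(1.0, 1.0, math.pi / 2, 1.0, 2.0))
```

In the full 3D case this becomes a tf2 frame lookup, but the 2D version captures the idea: the policy sees the target relative to the robot, and SLAM supplies the robot pose.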

3. Object navigation integration#

Integrating an object localization module (e.g., Locate3D) enables object navigation with COMPASS: the obj_target_generator node converts localized object bounding boxes into navigation target poses, allowing the robot to autonomously approach specified objects.
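The core of such a conversion can be sketched as follows: take the center of the localized bounding box, place the goal a standoff distance short of it along the line from the robot, and orient the goal toward the object. This is a simplified 2D illustration under assumed conventions, not the actual obj_target_generator implementation.

```python
import math

def bbox_to_goal(bbox_min, bbox_max, robot_xy, standoff=1.0):
    """Turn an axis-aligned object bounding box into a 2D navigation goal.

    bbox_min/bbox_max are (x, y) corners of the box in the map frame,
    robot_xy is the robot's current position, and standoff is how far
    from the object center the robot should stop (meters).
    """
    cx = (bbox_min[0] + bbox_max[0]) / 2.0
    cy = (bbox_min[1] + bbox_max[1]) / 2.0
    # Approach direction: from the robot toward the object center.
    dx, dy = cx - robot_xy[0], cy - robot_xy[1]
    dist = math.hypot(dx, dy)
    yaw = math.atan2(dy, dx)  # face the object at the goal
    if dist <= standoff:
        # Already within the standoff radius; hold position, face the object.
        return robot_xy[0], robot_xy[1], yaw
    scale = (dist - standoff) / dist
    return robot_xy[0] + dx * scale, robot_xy[1] + dy * scale, yaw

# Object centered at (4, 0), robot at the origin, 1 m standoff:
print(bbox_to_goal((3.5, -0.5), (4.5, 0.5), (0.0, 0.0)))  # → (3.0, 0.0, 0.0)
```

The standoff keeps the goal outside the object's footprint so the navigator never targets a pose inside the obstacle itself.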