curobo.examples.getting_started.volumetric_mapping module¶
Fuse RGB-D frames into a TSDF world model and compute an ESDF for collision-aware planning.
cuRobo builds a persistent volumetric world model from depth observations and known geometry, then generates a dense Euclidean Signed Distance Field (ESDF) that enables fast, differentiable collision queries for motion generation. Depth frames are fused into a block-sparse TSDF via lock-free voxel-centric integration kernels, while analytic primitives (cuboids, meshes) are stamped directly into a separate geometry channel. On demand, an ESDF is computed at task-appropriate resolution using the Parallel Banding Algorithm (PBA+), providing \(O(1)\) trilinear distance queries for robot collision spheres.
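The \(O(1)\) trilinear distance query mentioned above can be sketched in plain PyTorch. This is an illustration of the interpolation math only, not cuRobo's internal kernel; the function name and signature are hypothetical, assuming the (nx, ny, nz) grid convention (X slowest, Z fastest) with values at voxel centers and the grid centered at `origin`:

```python
import torch
import torch.nn.functional as F

def query_esdf(esdf, points, origin, voxel_size):
    """Trilinearly interpolate an (nx, ny, nz) ESDF at world-space points.

    Assumes the grid is centered at `origin` with distance values stored
    at voxel centers.
    """
    size = torch.tensor(esdf.shape, dtype=points.dtype)
    # With align_corners=True, normalized coords [-1, 1] span the centers
    # of the boundary voxels, i.e. a width of (n - 1) * voxel_size.
    half = (size - 1) * voxel_size / 2
    local = (points - origin) / half
    # grid_sample takes a (N, C, D, H, W) volume and an (N, d, h, w, 3)
    # sample grid whose last dim is ordered (x, y, z) -> (W, H, D).
    vol = esdf.permute(2, 1, 0)[None, None]   # (1, 1, nz, ny, nx)
    grid = local[None, None, None]            # (1, 1, 1, N, 3)
    out = F.grid_sample(vol, grid, mode='bilinear', align_corners=True)
    return out.reshape(-1)
```

Because the interpolation is a fixed-cost gather of eight neighboring voxels, the query time is independent of scene size, which is what makes per-sphere collision checking cheap inside an optimization loop.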
This tutorial walks through building a 3D map from an RGB-D video sequence and
using it for collision-aware robot motion planning. You will learn how cuRobo’s
Mapper API fuses depth frames into a compact world model and
generates a signed distance field that robot planners can query efficiently.
By the end of this tutorial you will have:
Fused a sequence of depth images into a block-sparse TSDF
Stamped a known obstacle (cuboid) into the map as analytic geometry
Computed a dense ESDF over the workspace
Rendered a depth image and surface normals from any camera pose
Extracted a colored triangle mesh of the reconstruction
Step 1: Download the dataset¶
This tutorial uses the Sun3D indoor RGB-D dataset, which provides color images, depth maps, and ground-truth camera poses.
Quick start (downloads a single scene, ~1400 MB):
wget http://3dvision.princeton.edu/projects/2016/3DMatch/downloads/rgbd-datasets/sun3d-mit_76_studyroom-76-1studyroom2.zip
mkdir -p datasets/sun3d
unzip sun3d-mit_76_studyroom-76-1studyroom2.zip -d datasets/sun3d
The extracted directory should look like:
datasets/sun3d/sun3d-mit_76_studyroom-76-1studyroom2/
camera-intrinsics.txt
<sequence_name>/
000001.color.png
000001.depth.png
000001.pose.txt
...
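The intrinsics and pose files in this layout are plain whitespace-separated matrices (a 3x3 camera matrix and 4x4 camera-to-world poses), and the 16-bit depth PNGs store millimeters, which is why a depth scale of 0.001 appears later in this module. A minimal loading sketch with hypothetical helper names, shown here on inline data rather than the files on disk:

```python
import io
import numpy as np

def load_matrix(path_or_buffer):
    """Read a whitespace-separated matrix file: the 3x3 intrinsics in
    camera-intrinsics.txt or a 4x4 camera-to-world pose in *.pose.txt."""
    return np.loadtxt(path_or_buffer)

def backproject(depth_m, K):
    """Lift a metric depth map (H, W) into camera-frame points (H, W, 3)
    via the pinhole model x = (u - cx) / fx * z, y = (v - cy) / fy * z."""
    h, w = depth_m.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    x = (u - K[0, 2]) / K[0, 0] * depth_m
    y = (v - K[1, 2]) / K[1, 1] * depth_m
    return np.stack([x, y, depth_m], axis=-1)

# Example intrinsics in place of camera-intrinsics.txt:
K = load_matrix(io.StringIO("525 0 319.5\n0 525 239.5\n0 0 1"))
points = backproject(np.full((480, 640), 2.0), K)  # a flat 2 m depth map
```

Applying a frame's 4x4 pose to these camera-frame points (in homogeneous coordinates) places them in the world frame, which is the input the TSDF integration consumes.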
Step 2: Run the tutorial¶
python -m curobo.examples.getting_started.volumetric_mapping --root ./datasets/sun3d/sun3d-mit_76_studyroom-76-1studyroom2
To explore the reconstruction interactively, add --visualize. This starts a
Viser server you can open in your browser at
http://localhost:8080. Drag the gizmo to inspect ESDF slices through the scene.
python -m curobo.examples.getting_started.volumetric_mapping --root ./datasets/sun3d/sun3d-mit_76_studyroom-76-1studyroom2 --visualize
Step 3: Check the output¶
When the tutorial finishes successfully you will see:
Loading Sun3D dataset from ./datasets/sun3d...
Found 200 frames
Mapper initialized: 42.0 MB
Integrating 200 frames...
Rendering from first camera pose...
Saved renders to: ~/.cache/curobo/examples/volumetric_mapping
Computing ESDF...
Extracting mesh...
Saved mesh: output_mesh.ply (150,000 vertices)
The following files are written to ~/.cache/curobo/examples/volumetric_mapping/
(override with curobo._src.runtime.cache_dir):
rendered_depth.png: depth colormap rendered from the first camera pose
rendered_normals.png: surface normal colormap
rendered_shaded.png: Phong-shaded surface view
output_mesh.ply: colored triangle mesh of the full reconstruction
- create_scene_with_static_obstacles(
- device='cuda:0',
- )¶
Create a scene with a cuboid and a mesh to stamp into the TSDF geometry channel.
These primitives are integrated analytically (not from depth) and do not decay, demonstrating how known geometry (e.g., a table, a robot link) can be fused alongside depth observations.
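What "stamping" analytic geometry means can be sketched directly: evaluate the primitive's exact signed distance at every voxel center and fuse it into the grid with a min-union. The helpers below are illustrative only, not cuRobo's implementation, and assume the same centered voxel-grid convention as the rest of this module:

```python
import torch

def box_sdf(points, center, half_extents):
    """Exact signed distance from points (N, 3) to an axis-aligned box."""
    q = (points - center).abs() - half_extents
    outside = q.clamp(min=0.0).norm(dim=-1)
    inside = q.max(dim=-1).values.clamp(max=0.0)
    return outside + inside

def stamp_box(grid, origin, voxel_size, center, half_extents):
    """Fuse the box into a (nx, ny, nz) distance grid in place (min-union)."""
    n = torch.tensor(grid.shape)
    idx = torch.stack(torch.meshgrid(
        *[torch.arange(s) for s in grid.shape], indexing='ij'), dim=-1)
    # world coordinates of voxel centers, grid centered at `origin`
    pts = origin + (idx + 0.5 - n / 2) * voxel_size
    d = box_sdf(pts.reshape(-1, 3), center, half_extents).reshape(grid.shape)
    torch.minimum(grid, d, out=grid)
```

Because the distance is computed analytically rather than from noisy depth, such primitives stay exact and never decay, matching the behavior described above.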
- extract_esdf_slice(
- esdf_grid,
- origin,
- voxel_size,
- slice_pose,
- slice_size=1.0,
- slice_resolution=128,
- )¶
Extract a 2D slice of the ESDF at the given pose.
- Parameters:
  - esdf_grid (Tensor) – (nx, ny, nz) ESDF tensor (VoxelGrid convention: X slowest, Z fastest).
  - origin (Tensor) – Grid center (3,).
  - voxel_size (float) – Size of each voxel.
  - slice_pose (ndarray) – 4x4 homogeneous transform defining slice plane.
  - slice_size (float) – Physical size of slice in meters.
  - slice_resolution (int) – Number of pixels per side.
- Return type:
- Returns:
RGB image with blue=inside, white=zero, red=outside.
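Under the documented conventions, a simplified version of this slice extraction fits in a few lines of NumPy/PyTorch. This sketch uses nearest-voxel sampling for brevity where the library interpolates, and the blue/white/red coloring is only an approximation of the example's palette:

```python
import numpy as np
import torch

def esdf_slice(esdf, origin, voxel_size, slice_pose, slice_size=1.0, res=64):
    """Sample the ESDF over the XY plane of `slice_pose` and colorize:
    blue inside (d < 0), white near zero, red outside (d > 0)."""
    n = np.array(esdf.shape)
    t = np.linspace(-slice_size / 2, slice_size / 2, res)
    u, v = np.meshgrid(t, t)
    # homogeneous pixel centers in the slice frame (z = 0), moved to world
    pix = np.stack([u, v, np.zeros_like(u), np.ones_like(u)], axis=-1)
    world = (pix.reshape(-1, 4) @ slice_pose.T)[:, :3]
    # nearest-voxel lookup for brevity (the real query is trilinear)
    idx = np.floor((world - origin.numpy()) / voxel_size + n / 2).astype(int)
    idx = np.clip(idx, 0, n - 1)
    d = esdf.numpy()[idx[:, 0], idx[:, 1], idx[:, 2]].reshape(res, res)
    img = np.ones((res, res, 3))
    img[..., 0] -= np.clip(-d, 0, 1)            # inside fades red -> blue
    img[..., 2] -= np.clip(d, 0, 1)             # outside fades blue -> red
    img[..., 1] -= np.clip(np.abs(d), 0, 1)     # both fade green
    return img
```

Dragging the gizmo in the `--visualize` view effectively just changes `slice_pose` and re-runs this kind of sampling.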
- class Sun3dDataset(
- root,
- device='cuda',
- depth_scale=0.001,
- )¶
Bases:
torch.utils.data.dataset.Dataset
Sun3D RGB-D dataset loader.
- main()¶