For the ROS1 version and ROS1 Docker environment, please go to the main branch.
We will present VDB-GPDF at IROS 2025, Session Mapping 4. See you there!
The key aspect is a latent Local GP Signed Distance Field (L-GPDF) contained in a local VDB structure that allows fast queries of the Euclidean distance, surface properties, and their uncertainties for points in the field of view. Probabilistic fusion is then performed by merging the inferred values of these points into a global VDB structure that is efficiently maintained over time. After fusion, the surface mesh is recovered, and a global GP Signed Distance Field (G-GPDF) is generated and made available for downstream applications to query accurate distances and gradients.
VDB_GPDF works with depth camera and LiDAR datasets. For ROS1, it has been tested on the Cow and Lady, KITTI, Newer College, and Mai City datasets.
For ROS2, we currently only provide a launch file and yaml config for the Cow and Lady dataset, but you can create other ROS2 config files based on the ROS1 config files. You can also modify the parameters in the launch file and config yamls to work with your own dataset. To run with a live sensor, please disable the data buffer (enable_databuf) in the yaml.
A temporary link to a section of the Cow and Lady dataset (converted_ros2_bag.zip) is provided so you can quickly try it with ROS2.
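For example, assuming the archive was downloaded to ~/Downloads (both paths below are only assumptions; adjust them to your setup):
```bash
# unzip the converted ROS2 bag into a data folder of your choice
unzip ~/Downloads/converted_ros2_bag.zip -d ~/Data/
```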
```bash
# build through Dockerfile
docker build -t vdbgpdf_ros2:humble .

# to run with host to have rviz visualisation
xhost +local:docker

# please download the converted_ros2_bag.zip, unzip it and modify the path in the following command
docker run -it \
    --net=host \
    -e DISPLAY=$DISPLAY \
    -e QT_X11_NO_MITSHM=1 \
    -v /tmp/.X11-unix:/tmp/.X11-unix \
    -v /dev/shm:/dev/shm \
    -v /home/lan/Data/converted_ros2_bag:/workspace/data \
    --name vdbgpdf_ros2 \
    vdbgpdf_ros2:humble

# please double check the path
ros2 launch vdb_gpdf_mapping vdb_gpdf_mapping_cow.py \
    bag_file:=/workspace/data/converted_ros2_bag.db3
```
Then rviz will pop up and show the mapping. You can call the services to get the full reconstruction and to query the distance field.
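To call the services described below while the container is running, you can open a second shell in the same container (a small convenience sketch; the container name comes from the run command above, and you may need to source the ROS2 setup inside the container):
```bash
# attach a second shell to the running container
docker exec -it vdbgpdf_ros2 bash
```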
To deploy it in your own PC environment, please install the following dependencies. Tested system: Ubuntu 22.04 with ROS2 Humble.
- python3 (3.10.12)
- eigen (3.4.0)
- gflags (2.2.2)
- glog (0.5.0)
- openmp (4.5)
- boost (1.80)
- OpenVDB (9.1.1)
If you already have ROS2 installed, then hopefully installing OpenVDB will be enough.
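To confirm which ROS2 distribution is sourced in your shell, a quick check (just a suggestion, not part of the original instructions):
```bash
# should print "humble" if the expected ROS2 distribution is sourced
echo $ROS_DISTRO
```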
System dependencies:
```bash
sudo apt-get update && sudo apt-get install -y libblosc-dev \
    libboost-iostreams-dev \
    libboost-system-dev \
    libeigen3-dev \
    && rm -rf /var/lib/apt/lists/*
```
Install OpenVDB from source:
```bash
git clone --depth 1 https://github.com/nachovizzo/openvdb.git -b nacho/vdbfusion \
    && cd openvdb \
    && mkdir build && cd build \
    && cmake -DCMAKE_POSITION_INDEPENDENT_CODE=ON -DUSE_ZLIB=OFF .. \
    && make -j$(nproc) all install # we can use -j4 to save ram
```
Reference: http://gitlab.ram-lab.com/ramlab_dataset_sensor/mapping_codebase/vdbmapping
For the official installation instructions and troubleshooting, please see: https://www.openvdb.org/documentation/doxygen/build.html
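To sanity-check the OpenVDB install, you can look for the installed CMake package config (this assumes the default /usr/local prefix used by make install):
```bash
# the OpenVDB CMake config should be present after "make install"
ls /usr/local/lib/cmake/OpenVDB/
```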
Clone the repository:
```bash
git clone --recurse-submodules git@github.com:UTS-RI/VDB_GPDF.git
```
Please remove the catkin_simple and minkindr packages in the 3dparty folder if you already have them.
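The package is then built like a regular ROS2 package; the workspace layout below is only an assumption (adjust the paths to your own colcon workspace):
```bash
# assuming the repository was cloned into ~/ros2_ws/src (hypothetical workspace path)
cd ~/ros2_ws
colcon build --symlink-install
source install/setup.bash
```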
After sourcing your workspace, run the launch file for the Cow and Lady dataset directly:
```bash
ros2 launch vdb_gpdf_mapping vdb_gpdf_mapping_cow.py bag_file:=/home/lan/Data/converted_ros2_bag/converted_ros2_bag.db3
```
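While the node is running, you can list the available services and inspect their interfaces from a second terminal. This uses only the standard ROS2 CLI together with the service type named in the examples below:
```bash
# list advertised services together with their types
ros2 service list -t
# show the request/response fields of the save_map service
ros2 interface show vdb_gpdf_mapping_msgs/srv/SaveMap
```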
During or after the run, you can call the services. The save_map service saves the map as .pcd and .ply files under the path you provide. After the call, it also publishes the vertices of the reconstructed mesh as a point cloud, which you can visualise in rviz (topic name: /vdbmap). The save_map service definition:
```
string path
float64 filter_res
---
bool success
```
To call the above service for saving the map, please use this example:
```bash
ros2 service call /save_map vdb_gpdf_mapping_msgs/srv/SaveMap "{filter_res: 0.005, path: '/home/lan/Downloads/'}"
```
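After the call returns success, the exported files should appear under the path you passed (the path below simply mirrors the example above):
```bash
# the map is exported as .pcd and .ply files under the given path
ls /home/lan/Downloads/*.pcd /home/lan/Downloads/*.ply
```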
We have two query_map services to query the representation for distances and gradients. The input is a set of points or a slice, and the output contains a time stamp, gradients, distances, and whether each queried point has been observed.

Points query service: /points_query_map
```
float64[] points
---
builtin_interfaces/Time stamp
float64[] gradients
float64[] distances
bool[] if_observed
```
Slice query service: /slice_query_map
```
float64[] slice_ranges
---
builtin_interfaces/Time stamp
float64[] points
float64[] gradients
float64[] distances
bool[] if_observed
```
The boundary of the main area of the Cow and Lady dataset is:
```
x: -3, 0.5
y: -3, 3
z: 0, 2.8
```
Example: To query 4 points of the Cow and Lady dataset:
```bash
ros2 service call /points_query_map vdb_gpdf_mapping_msgs/srv/PointsQueryMap "{points: [-1.5,0,0.9,-1.5,0,1.0,-1.5,0,0.95,-1.5,0,0.96]}"
```
Example: To query a slice of the Cow and Lady dataset:
```bash
ros2 service call /slice_query_map vdb_gpdf_mapping_msgs/srv/SliceQueryMap "{slice_ranges: [-3.0,3.0,-3.0,0.3,0.9,0.05]}"
```
If you use this work, please cite:
```
@article{wu2025vdb,
author={Wu, Lan and Le Gentil, Cedric and Vidal-Calleja, Teresa},
journal={IEEE Robotics and Automation Letters},
title={VDB-GPDF: Online Gaussian Process Distance Field With VDB Structure},
year={2025},
volume={10},
number={1},
pages={374-381},
doi={10.1109/LRA.2024.3505814}
}
```
We sincerely thank the authors of the repositories listed below for making their code available.
- Kin-Zhang/vdbfusion_mapping: for providing the code structure and fixing the pose issue
- PRBonn/vdbfusion_ros: the vdbfusion framework
- jianhao jiao: the first version of vdbfusion mapping ros
- paucarre: the RGB version of vdbfusion
- ethz-asl/voxblox: the voxblox framework with mesh and distance field
All parameters and default values can be found in the yaml files in the config folder.
| Parameter | Description | Type | Recommended Value |
|---|---|---|---|
| lider_topic | input point cloud topic | std::string | depend on dataset |
| pose_source | we have three options for the input pose | int | 0: tf_tree option, need to provide child_frame and world_frame; 1: tf_topic option, need to provide tf_topic; 2: odom_topic option, need to provide odom_topic |
| world_frame | world frame for the map | std::string | "world" |
| timestamp_tolerance_ms | time tolerance between the pose and the raw point cloud in ms | double | 2 |
| debug_print | whether to print some debugging information | bool | false |
| enable_databuf | to enable a databuf for the incoming point clouds | bool | false: live camera; true: pre-recorded dataset |
| min_scan_range | min scan range respect to the sensor position | double | depend on dataset |
| max_scan_range | max scan range respect to the sensor position | double | depend on dataset |
| max_height | max height respect to the sensor position | double | depend on dataset |
| fill_holes | to fill holes for the mesh reconstruction | bool | true |
| use_color | to use color as surface property for fusion and gp inference | bool | true |
| sdf_trunc | this param is for vdbfusion comparison | float | depend on dataset and voxel resolution |
| space_carving | this param is for vdbfusion comparison | bool | true |
| distance_method | method to compute the distance | int | 0: RevertingGPDF, refer the paper "Accurate Gaussian-Process-based Distance Fields with applications to Echolocation and Mapping"; 1: LogGPDF, refer the paper "Faithful Euclidean Distance Field from Log-Gaussian Process Implicit Surfaces" |
| sensor_type | to choose depth camera or lidar | int | 0: depth camera; 1: lidar |
| voxel_size_local | voxel size for local gp model | float | depend on dataset |
| voxel_overlapping | whether to have an overlapping area between local gps | int | positive: enable with value to control overlapping area; negative: disable |
| voxel_downsample | further downsample even after the voxelisation | int | positive: enable with an int value to control the further downsampling; 1: disable |
| voxel_size_global | voxel size for global gp model. this is the map resolution | float | depend on dataset |
| variance_method | fusion method | int | 0: not use variance, constant weight; 1: use variance, weight = 1-variance; 2: use occupancy u(x) variance, probabilistic fusion, 1/variance; 3: use distance v(x) variance, probabilistic fusion, 1/variance |
| variance_cap | cap the variance | float | 0.5 |
| variance_on_surface | to setup a variance on the surface | float | 0.001 |
| recon_min_weight | min weight for the mesh | double | 0.5 |
| surface_normal_method | we have two options to compute surface normals | int | 0: pcl library; 1: raycasting |
| surface_normal_num | number of points to contribute the normal | int | 10 |
| surface_value | setup the surface points value | float | -1: set up the surface value as inside; 1: set up the surface value as outside |
| query_iterval | query voxels for local gp: voxel interval | float | same as map resolution |
| query_trunc_in | query voxels for local gp: trunc inside of the surface | int | depend on dataset |
| query_trunc_out | query voxels for local gp: trunc outside of the surface | int | depend on dataset |
| freespace_iterval | query voxels for local gp: voxel interval in freespace, important for dynamic performance | float | same as map resolution |
| freespace_trunc_out | query voxels for local gp: trunc outside of the surface | int | depend on dataset |
| query_downsample | whether to downsample the query voxels for local gp after they are generated | float | positive: enable with a value for the downsampling voxel resolution; negative: disable |
| map_lambda_scale | RevertingGPDF: map_lambda_scale = 1/(length_scale)^2; LogGPDF: map_lambda_scale = sqrt(3)/length_scale. length_scale has the same definition as in the gp kernel | double | depend on dataset |
| map_noise | map noise | double | depend on dataset |
| color_scale | color length scale | double | 10 |
| smooth_param | smooth param to control the accuracy | double | 100 |
| tx,ty,tz,x,y,z,w | static tf for pose | double | depend on dataset |
| invert_static_tf | whether to invert the static tf | bool | depend on dataset |
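If you want to double-check which values the node actually loaded at runtime, the standard ROS2 parameter CLI can be used; the node name below is only a guess, so replace it with the name reported by `ros2 node list`:
```bash
# list running nodes, then inspect the parameters of the mapping node
ros2 node list
ros2 param list /vdb_gpdf_mapping_node            # hypothetical node name
ros2 param get /vdb_gpdf_mapping_node voxel_size_global
```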

