ACloudViewer  3.9.4
A Modern Library for 3D Data Processing
cloudViewer::t::pipelines::odometry Namespace Reference

Classes

class  OdometryConvergenceCriteria
 
class  OdometryResult
 
class  OdometryLossParams
 

Enumerations

enum class  Method { PointToPlane , Intensity , Hybrid }
 

Functions

OdometryResult RGBDOdometryMultiScalePointToPlane (const RGBDImage &source, const RGBDImage &target, const Tensor &intrinsics, const Tensor &trans, const float depth_scale, const float depth_max, const std::vector< OdometryConvergenceCriteria > &criteria, const OdometryLossParams &params)
 
OdometryResult RGBDOdometryMultiScaleIntensity (const RGBDImage &source, const RGBDImage &target, const Tensor &intrinsic, const Tensor &trans, const float depth_scale, const float depth_max, const std::vector< OdometryConvergenceCriteria > &criteria, const OdometryLossParams &params)
 
OdometryResult RGBDOdometryMultiScaleHybrid (const RGBDImage &source, const RGBDImage &target, const Tensor &intrinsics, const Tensor &trans, const float depth_scale, const float depth_max, const std::vector< OdometryConvergenceCriteria > &criteria, const OdometryLossParams &params)
 
OdometryResult RGBDOdometryMultiScale (const t::geometry::RGBDImage &source, const t::geometry::RGBDImage &target, const core::Tensor &intrinsics, const core::Tensor &init_source_to_target=core::Tensor::Eye(4, core::Float64, core::Device("CPU:0")), const float depth_scale=1000.0f, const float depth_max=3.0f, const std::vector< OdometryConvergenceCriteria > &criteria_list={10, 5, 3}, const Method method=Method::Hybrid, const OdometryLossParams &params=OdometryLossParams())
 Creates an RGBD image pyramid from the original source and target RGBD images and performs hierarchical odometry using the specified method. Intended for offline odometry, where peak performance is not required and previously computed vertex/normal maps are not reused. Input RGBD images hold a depth image (UInt16 or Float32) with a scale factor and a color image (UInt8 x 3). More...
 
OdometryResult ComputeOdometryResultPointToPlane (const core::Tensor &source_vertex_map, const core::Tensor &target_vertex_map, const core::Tensor &target_normal_map, const core::Tensor &intrinsics, const core::Tensor &init_source_to_target, const float depth_outlier_trunc, const float depth_huber_delta)
 Estimates the 4x4 rigid transformation T from source to target, with inlier rmse and fitness. Performs one iteration of RGBD odometry using loss function $[(V_p - V_q)^T N_p]^2$, where $ V_p $ denotes the vertex at pixel p in the source, $ V_q $ denotes the vertex at pixel q in the target, $ N_p $ denotes the normal at pixel p in the source. q is obtained by transforming p with init_source_to_target then projecting with intrinsics. Reference: KinectFusion, ISMAR 2011. More...
 
OdometryResult ComputeOdometryResultIntensity (const core::Tensor &source_depth, const core::Tensor &target_depth, const core::Tensor &source_intensity, const core::Tensor &target_intensity, const core::Tensor &target_intensity_dx, const core::Tensor &target_intensity_dy, const core::Tensor &source_vertex_map, const core::Tensor &intrinsics, const core::Tensor &init_source_to_target, const float depth_outlier_trunc, const float intensity_huber_delta)
 Estimates the 4x4 rigid transformation T from source to target, with inlier rmse and fitness. Performs one iteration of RGBD odometry using loss function $(I_p - I_q)^2$, where $ I_p $ denotes the intensity at pixel p in the source, $ I_q $ denotes the intensity at pixel q in the target. q is obtained by transforming p with init_source_to_target then projecting with intrinsics. Reference: Real-time visual odometry from dense RGB-D images, ICCV Workshops, 2011. More...
 
OdometryResult ComputeOdometryResultHybrid (const core::Tensor &source_depth, const core::Tensor &target_depth, const core::Tensor &source_intensity, const core::Tensor &target_intensity, const core::Tensor &target_depth_dx, const core::Tensor &target_depth_dy, const core::Tensor &target_intensity_dx, const core::Tensor &target_intensity_dy, const core::Tensor &source_vertex_map, const core::Tensor &intrinsics, const core::Tensor &init_source_to_target, const float depth_outlier_trunc, const float depth_huber_delta, const float intensity_huber_delta)
 Estimates the 4x4 rigid transformation T from source to target, with inlier rmse and fitness. Performs one iteration of RGBD odometry using loss function $(I_p - I_q)^2 + \lambda(D_p - (D_q)')^2$, where $ I_p $ denotes the intensity at pixel p in the source, $ I_q $ denotes the intensity at pixel q in the target, $ D_p $ denotes the depth at pixel p in the source, and $ D_q $ denotes the depth at pixel q in the target. q is obtained by transforming p with init_source_to_target then projecting with intrinsics. Reference: J. Park, Q.Y. Zhou, and V. Koltun, Colored Point Cloud Registration Revisited, ICCV, 2017. More...
 
core::Tensor ComputeOdometryInformationMatrix (const geometry::Image &source_depth, const geometry::Image &target_depth, const core::Tensor &intrinsic, const core::Tensor &source_to_target, const float dist_thr, const float depth_scale, const float depth_max)
 
static core::Tensor CreateIntrisicTensor ()
 
static void ComputeOdometryResultPointToPlane (benchmark::State &state, const core::Device &device)
 
static void RGBDOdometryMultiScale (benchmark::State &state, const core::Device &device, const t::pipelines::odometry::Method &method)
 
 Unit (benchmark::kMillisecond)
 
 BENCHMARK_CAPTURE (RGBDOdometryMultiScale, Hybrid_CPU, core::Device("CPU:0"), t::pipelines::odometry::Method::Hybrid) -> Unit(benchmark::kMillisecond)
 
 BENCHMARK_CAPTURE (RGBDOdometryMultiScale, Intensity_CPU, core::Device("CPU:0"), t::pipelines::odometry::Method::Intensity) -> Unit(benchmark::kMillisecond)
 
 BENCHMARK_CAPTURE (RGBDOdometryMultiScale, PointToPlane_CPU, core::Device("CPU:0"), t::pipelines::odometry::Method::PointToPlane) -> Unit(benchmark::kMillisecond)
 
void pybind_odometry (py::module &m)
 

Variables

static const std::unordered_map< std::string, std::string > map_shared_argument_docstrings
 

Enumeration Type Documentation

◆ Method

Enumerator
PointToPlane 
Intensity 
Hybrid 

Definition at line 23 of file RGBDOdometry.h.

Function Documentation

◆ BENCHMARK_CAPTURE() [1/3]

cloudViewer::t::pipelines::odometry::BENCHMARK_CAPTURE ( RGBDOdometryMultiScale  ,
Hybrid_CPU  ,
core::Device("CPU:0")  ,
t::pipelines::odometry::Method::Hybrid   
) -> Unit(benchmark::kMillisecond)

◆ BENCHMARK_CAPTURE() [2/3]

cloudViewer::t::pipelines::odometry::BENCHMARK_CAPTURE ( RGBDOdometryMultiScale  ,
Intensity_CPU  ,
core::Device("CPU:0")  ,
t::pipelines::odometry::Method::Intensity   
) -> Unit(benchmark::kMillisecond)

◆ BENCHMARK_CAPTURE() [3/3]

cloudViewer::t::pipelines::odometry::BENCHMARK_CAPTURE ( RGBDOdometryMultiScale  ,
PointToPlane_CPU  ,
core::Device("CPU:0")  ,
t::pipelines::odometry::Method::PointToPlane   
) -> Unit(benchmark::kMillisecond)

◆ ComputeOdometryInformationMatrix()

core::Tensor cloudViewer::t::pipelines::odometry::ComputeOdometryInformationMatrix ( const geometry::Image &source_depth,
const geometry::Image &target_depth,
const core::Tensor &intrinsic,
const core::Tensor &source_to_target,
const float  dist_thr,
const float  depth_scale = 1000.0f,
const float  depth_max = 3.0f 
)

Estimates 6x6 information matrix from a pair of depth images. The process is akin to information matrix creation for point clouds.

Definition at line 489 of file RGBDOdometry.cpp.

References cloudViewer::t::geometry::Image::AsTensor(), cloudViewer::t::geometry::Image::ClipTransform(), cloudViewer::t::pipelines::kernel::odometry::ComputeOdometryInformationMatrix(), and cloudViewer::t::geometry::Image::CreateVertexMap().

Referenced by pybind_odometry().

◆ ComputeOdometryResultHybrid()

OdometryResult cloudViewer::t::pipelines::odometry::ComputeOdometryResultHybrid ( const core::Tensor &source_depth,
const core::Tensor &target_depth,
const core::Tensor &source_intensity,
const core::Tensor &target_intensity,
const core::Tensor &target_depth_dx,
const core::Tensor &target_depth_dy,
const core::Tensor &target_intensity_dx,
const core::Tensor &target_intensity_dy,
const core::Tensor &source_vertex_map,
const core::Tensor &intrinsics,
const core::Tensor &init_source_to_target,
const float  depth_outlier_trunc,
const float  depth_huber_delta,
const float  intensity_huber_delta 
)

Estimates the 4x4 rigid transformation T from source to target, with inlier rmse and fitness. Performs one iteration of RGBD odometry using loss function $(I_p - I_q)^2 + \lambda(D_p - (D_q)')^2$, where $ I_p $ denotes the intensity at pixel p in the source, $ I_q $ denotes the intensity at pixel q in the target, $ D_p $ denotes the depth at pixel p in the source, and $ D_q $ denotes the depth at pixel q in the target. q is obtained by transforming p with init_source_to_target then projecting with intrinsics. Reference: J. Park, Q.Y. Zhou, and V. Koltun, Colored Point Cloud Registration Revisited, ICCV, 2017.

Parameters
source_depth: (rows, cols, channels=1) Float32 source depth image obtained by PreprocessDepth before calling this function.
target_depth: (rows, cols, channels=1) Float32 target depth image obtained by PreprocessDepth before calling this function.
source_intensity: (rows, cols, channels=1) Float32 source intensity image obtained by RGBToGray before calling this function.
target_intensity: (rows, cols, channels=1) Float32 target intensity image obtained by RGBToGray before calling this function.
target_depth_dx: (rows, cols, channels=1) Float32 target depth gradient image along x-axis obtained by FilterSobel before calling this function.
target_depth_dy: (rows, cols, channels=1) Float32 target depth gradient image along y-axis obtained by FilterSobel before calling this function.
target_intensity_dx: (rows, cols, channels=1) Float32 target intensity gradient image along x-axis obtained by FilterSobel before calling this function.
target_intensity_dy: (rows, cols, channels=1) Float32 target intensity gradient image along y-axis obtained by FilterSobel before calling this function.
source_vertex_map: (rows, cols, channels=3) Float32 source vertex image obtained by CreateVertexMap before calling this function.
intrinsics: (3, 3) intrinsic matrix for projection.
init_source_to_target: (4, 4) initial transformation matrix from source to target.
depth_outlier_trunc: Depth difference threshold used to filter projective associations.
depth_huber_delta: Huber norm parameter used in depth loss.
intensity_huber_delta: Huber norm parameter used in intensity loss.
Returns
Odometry result, with (4, 4) optimized transformation matrix from source to target, inlier ratio, and fitness.

Definition at line 453 of file RGBDOdometry.cpp.

References cloudViewer::t::pipelines::kernel::odometry::ComputeOdometryResultHybrid(), cloudViewer::core::Tensor::GetShape(), LogError, and cloudViewer::t::pipelines::kernel::PoseToTransformation().

Referenced by pybind_odometry(), and RGBDOdometryMultiScaleHybrid().

◆ ComputeOdometryResultIntensity()

OdometryResult cloudViewer::t::pipelines::odometry::ComputeOdometryResultIntensity ( const core::Tensor &source_depth,
const core::Tensor &target_depth,
const core::Tensor &source_intensity,
const core::Tensor &target_intensity,
const core::Tensor &target_intensity_dx,
const core::Tensor &target_intensity_dy,
const core::Tensor &source_vertex_map,
const core::Tensor &intrinsics,
const core::Tensor &init_source_to_target,
const float  depth_outlier_trunc,
const float  intensity_huber_delta 
)

Estimates the 4x4 rigid transformation T from source to target, with inlier rmse and fitness. Performs one iteration of RGBD odometry using loss function $(I_p - I_q)^2$, where $ I_p $ denotes the intensity at pixel p in the source, $ I_q $ denotes the intensity at pixel q in the target. q is obtained by transforming p with init_source_to_target then projecting with intrinsics. Reference: Real-time visual odometry from dense RGB-D images, ICCV Workshops, 2011.

Parameters
source_depth: (rows, cols, channels=1) Float32 source depth image obtained by PreprocessDepth before calling this function.
target_depth: (rows, cols, channels=1) Float32 target depth image obtained by PreprocessDepth before calling this function.
source_intensity: (rows, cols, channels=1) Float32 source intensity image obtained by RGBToGray before calling this function.
target_intensity: (rows, cols, channels=1) Float32 target intensity image obtained by RGBToGray before calling this function.
target_intensity_dx: (rows, cols, channels=1) Float32 target intensity gradient image along x-axis obtained by FilterSobel before calling this function.
target_intensity_dy: (rows, cols, channels=1) Float32 target intensity gradient image along y-axis obtained by FilterSobel before calling this function.
source_vertex_map: (rows, cols, channels=3) Float32 source vertex image obtained by CreateVertexMap before calling this function.
intrinsics: (3, 3) intrinsic matrix for projection.
init_source_to_target: (4, 4) initial transformation matrix from source to target.
depth_outlier_trunc: Depth difference threshold used to filter projective associations.
intensity_huber_delta: Huber norm parameter used in intensity loss.
Returns
Odometry result, with (4, 4) optimized transformation matrix from source to target, inlier ratio, and fitness.

Definition at line 420 of file RGBDOdometry.cpp.

References cloudViewer::t::pipelines::kernel::odometry::ComputeOdometryResultIntensity(), cloudViewer::core::Tensor::GetShape(), LogError, and cloudViewer::t::pipelines::kernel::PoseToTransformation().

Referenced by pybind_odometry(), and RGBDOdometryMultiScaleIntensity().

◆ ComputeOdometryResultPointToPlane() [1/2]

◆ ComputeOdometryResultPointToPlane() [2/2]

OdometryResult cloudViewer::t::pipelines::odometry::ComputeOdometryResultPointToPlane ( const core::Tensor &source_vertex_map,
const core::Tensor &target_vertex_map,
const core::Tensor &target_normal_map,
const core::Tensor &intrinsics,
const core::Tensor &init_source_to_target,
const float  depth_outlier_trunc,
const float  depth_huber_delta 
)

Estimates the 4x4 rigid transformation T from source to target, with inlier rmse and fitness. Performs one iteration of RGBD odometry using loss function $[(V_p - V_q)^T N_p]^2$, where $ V_p $ denotes the vertex at pixel p in the source, $ V_q $ denotes the vertex at pixel q in the target, $ N_p $ denotes the normal at pixel p in the source. q is obtained by transforming p with init_source_to_target then projecting with intrinsics. Reference: KinectFusion, ISMAR 2011.

Parameters
source_vertex_map: (rows, cols, channels=3) Float32 source vertex image obtained by CreateVertexMap before calling this function.
target_vertex_map: (rows, cols, channels=3) Float32 target vertex image obtained by CreateVertexMap before calling this function.
target_normal_map: (rows, cols, channels=3) Float32 target normal image obtained by CreateNormalMap before calling this function.
intrinsics: (3, 3) intrinsic matrix for projection.
init_source_to_target: (4, 4) initial transformation matrix from source to target.
depth_outlier_trunc: Depth difference threshold used to filter projective associations.
depth_huber_delta: Huber norm parameter used in depth loss.
Returns
odometry result, with (4, 4) optimized transformation matrix from source to target, inlier ratio, and fitness.

Definition at line 392 of file RGBDOdometry.cpp.

References cloudViewer::t::pipelines::kernel::odometry::ComputeOdometryResultPointToPlane(), cloudViewer::core::Tensor::GetShape(), LogError, and cloudViewer::t::pipelines::kernel::PoseToTransformation().

Referenced by ComputeOdometryResultPointToPlane(), pybind_odometry(), and RGBDOdometryMultiScalePointToPlane().

◆ CreateIntrisicTensor()

◆ pybind_odometry()

◆ RGBDOdometryMultiScale() [1/2]

◆ RGBDOdometryMultiScale() [2/2]

OdometryResult cloudViewer::t::pipelines::odometry::RGBDOdometryMultiScale ( const t::geometry::RGBDImage &source,
const t::geometry::RGBDImage &target,
const core::Tensor &intrinsics,
const core::Tensor &init_source_to_target = core::Tensor::Eye(4, core::Float64, core::Device("CPU:0")),
const float  depth_scale = 1000.0f,
const float  depth_max = 3.0f,
const std::vector< OdometryConvergenceCriteria > &  criteria_list = {10, 5, 3},
const Method  method = Method::Hybrid,
const OdometryLossParams &  params = OdometryLossParams() 
)

Creates an RGBD image pyramid from the original source and target RGBD images and performs hierarchical odometry using the specified method. Intended for offline odometry, where peak performance is not required and previously computed vertex/normal maps are not reused. Input RGBD images hold a depth image (UInt16 or Float32) with a scale factor and a color image (UInt8 x 3).

Parameters
source: Source RGBD image.
target: Target RGBD image.
intrinsics: (3, 3) intrinsic matrix for projection, of core::Float64 on CPU.
init_source_to_target: (4, 4) initial transformation matrix from source to target, of core::Float64 on CPU.
depth_scale: Factor by which depth pixel values are divided to convert them to meters.
depth_max: Maximum depth at which to truncate the depth image, discarding noisy measurements beyond it.
criteria_list: Criteria used to define and terminate iterations, ordered from coarse to fine in multiscale odometry. Passing a vector of iteration counts, as in the default, triggers an implicit conversion.
method: Method used to apply RGBD odometry.
params: Parameters used in the loss functions, including the outlier rejection threshold and Huber norm parameters.
Returns
Odometry result, with (4, 4) optimized transformation matrix from source to target, inlier ratio, and fitness.

Definition at line 56 of file RGBDOdometry.cpp.

References AssertTensorDevice, AssertTensorShape, cloudViewer::t::geometry::Image::AsTensor(), cloudViewer::t::geometry::Image::ClipTransform(), cloudViewer::core::Tensor::Clone(), cloudViewer::t::geometry::RGBDImage::color_, cloudViewer::t::geometry::RGBDImage::depth_, cloudViewer::core::Float64, cloudViewer::t::geometry::Image::GetDevice(), Hybrid, Intensity, LogError, params, PointToPlane, RGBDOdometryMultiScaleHybrid(), RGBDOdometryMultiScaleIntensity(), RGBDOdometryMultiScalePointToPlane(), and cloudViewer::core::Tensor::To().

Referenced by pybind_odometry(), RGBDOdometryMultiScale(), and cloudViewer::t::pipelines::slam::Model::TrackFrameToModel().

◆ RGBDOdometryMultiScaleHybrid()

◆ RGBDOdometryMultiScaleIntensity()

◆ RGBDOdometryMultiScalePointToPlane()

OdometryResult cloudViewer::t::pipelines::odometry::RGBDOdometryMultiScalePointToPlane ( const RGBDImage &source,
const RGBDImage &target,
const Tensor &intrinsics,
const Tensor &trans,
const float  depth_scale,
const float  depth_max,
const std::vector< OdometryConvergenceCriteria > &  criteria,
const OdometryLossParams &  params 
)

◆ Unit()

cloudViewer::t::pipelines::odometry::Unit ( benchmark::kMillisecond  )

Variable Documentation

◆ map_shared_argument_docstrings

const std::unordered_map<std::string, std::string> cloudViewer::t::pipelines::odometry::map_shared_argument_docstrings
static

Definition at line 20 of file odometry.cpp.

Referenced by pybind_odometry().