RESEARCH PRODUCT

Visual Marker Guided Point Cloud Registration in a Large Multi-Sensor Industrial Robot Cell

Knut Berg Kaldestad, Joacim Dybedal, Atle Aalerud, Erind Ujkani, Geir Hovland

subject

Computer science, Point cloud, Iterative closest point, Cloud computing, Visualization, Industrial robot, Benchmark (computing), Calibration, RGB color model, Computer vision, Artificial intelligence

description

This paper presents a benchmark and accuracy analysis of 3D sensor calibration in a large industrial robot cell. The sensors used were Kinect v2 units, each containing an RGB camera and an IR camera that measures depth based on the time-of-flight principle. The approach is a novel procedure combining ArUco visual markers, region-of-interest filtering, and iterative closest point (ICP) registration. The sensors are calibrated pairwise, exploiting the fact that the point clouds generated by neighbouring time-of-flight sensors partially overlap. For a volume measuring 10 m × 14 m × 5 m, a typical point cloud accuracy of 5–10 cm was achieved using six sensor nodes.
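As illustration of the pairwise procedure described above, the following is a minimal sketch of the refinement step: it assumes a coarse transform between two sensor nodes has already been estimated from the ArUco markers, crops both point clouds to a shared region of interest, and refines the alignment with point-to-point ICP. The use of Open3D, the function name refine_pairwise, the file names, and the numeric parameters are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch of marker-guided pairwise registration refinement.
# Assumes T_init (4x4) was obtained separately from ArUco marker poses.
import numpy as np
import open3d as o3d


def refine_pairwise(source, target, T_init, roi_min, roi_max, max_dist=0.10):
    """Refine a marker-based initial alignment of `source` onto `target`.

    source, target : o3d.geometry.PointCloud from two overlapping sensor nodes
    T_init         : 4x4 initial transform estimated from ArUco markers (assumed given)
    roi_min/max    : corners of the shared region of interest, in the target frame
    max_dist       : max ICP correspondence distance (0.10 m here, an assumption)
    """
    roi = o3d.geometry.AxisAlignedBoundingBox(np.asarray(roi_min, float),
                                              np.asarray(roi_max, float))

    # Crop the target, and crop a transformed copy of the source, so ICP
    # only sees the overlapping region of the two point clouds.
    src = o3d.geometry.PointCloud(source).transform(T_init).crop(roi)
    tgt = o3d.geometry.PointCloud(target).crop(roi)

    result = o3d.pipelines.registration.registration_icp(
        src, tgt, max_dist, np.eye(4),
        o3d.pipelines.registration.TransformationEstimationPointToPoint())

    # Compose the ICP correction with the marker-based initial guess.
    return result.transformation @ T_init


if __name__ == "__main__":
    src = o3d.io.read_point_cloud("sensor_a.pcd")   # hypothetical file names
    tgt = o3d.io.read_point_cloud("sensor_b.pcd")
    T_init = np.eye(4)                              # placeholder marker-based guess
    T = refine_pairwise(src, tgt, T_init,
                        roi_min=(-5.0, -7.0, 0.0), roi_max=(5.0, 7.0, 5.0))
    print(T)
```

In a multi-sensor cell, such pairwise transforms would be chained or jointly optimised to express all six sensor nodes in a common world frame; the details of that step are beyond this sketch.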

https://doi.org/10.1109/mesa.2018.8449195