AUTHOR
Erind Ujkani
Industrial Environment Mapping Using Distributed Static 3D Sensor Nodes
This paper presents a system architecture for mapping and real-time monitoring of a relatively large industrial robotic environment of size 10 m × 15 m × 5 m. Six sensor nodes with embedded computing power and local processing of the 3D point clouds are placed close to the ceiling. The system architecture and data processing are based on the Robot Operating System (ROS) and the Point Cloud Library (PCL). The 3D sensor used is the Microsoft Kinect for Xbox One, and point cloud data is collected at 20 Hz. A new manual calibration procedure is developed using reflective planes. The sensor's specified range is 0.8 m to 4.2 m, while depth data up to 9 m is used in this paper. Despite t…
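As a rough illustration of the kind of plane estimation a reflective-plane calibration step relies on, the sketch below fits a plane z = a·x + b·y + c to a set of 3D points by least squares. This is not the paper's procedure; it is a minimal, self-contained stand-in, and all names are illustrative.

```python
def fit_plane(points):
    """Least-squares fit of z = a*x + b*y + c via the normal equations.

    Illustrative sketch only: a real calibration pipeline (e.g. in PCL)
    would use RANSAC-style robust fitting on noisy depth data.
    """
    # Accumulate A^T A and A^T z for design-matrix rows [x, y, 1].
    s = [[0.0] * 3 for _ in range(3)]
    t = [0.0] * 3
    for x, y, z in points:
        row = (x, y, 1.0)
        for i in range(3):
            for j in range(3):
                s[i][j] += row[i] * row[j]
            t[i] += row[i] * z
    # Solve the 3x3 system with Gaussian elimination (partial pivoting).
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(s[r][col]))
        s[col], s[piv] = s[piv], s[col]
        t[col], t[piv] = t[piv], t[col]
        for r in range(col + 1, 3):
            f = s[r][col] / s[col][col]
            for j in range(col, 3):
                s[r][j] -= f * s[col][j]
            t[r] -= f * t[col]
    coeffs = [0.0] * 3
    for r in (2, 1, 0):
        coeffs[r] = (t[r] - sum(s[r][j] * coeffs[j]
                                for j in range(r + 1, 3))) / s[r][r]
    return tuple(coeffs)  # (a, b, c)
```

With points sampled from a known plane, the fitted coefficients recover that plane exactly (up to floating-point precision).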
Real-time human collision detection for industrial robot cells
A collision detection system triggering on human motion was developed using the Robot Operating System (ROS) and the Point Cloud Library (PCL). ROS formed the core of the programs and handled communication with an industrial robot. The depth fields from the 3D cameras were combined using PCL, which was also the underlying tool for segmenting the human from the registered point clouds. Several collision detection algorithms were benchmarked to compare solutions. The registration process gave satisfactory results when testing the repeatability and accuracy of the implementation. The segmentation algorithm was able to segment a person represente…
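To make the collision-triggering idea concrete, here is a minimal sketch (not the paper's implementation): a segmented human cluster is tested against a robot link modelled as a sphere, and a collision is flagged when any cluster point intrudes into the sphere plus a safety margin. The function names, the sphere model, and the margin value are all assumptions for illustration.

```python
import math


def min_distance(cluster, center):
    """Smallest Euclidean distance from a point cluster to a center point."""
    return min(math.dist(p, center) for p in cluster)


def collision(cluster, center, radius, margin=0.3):
    """True if any cluster point lies within the sphere plus a safety margin.

    Illustrative only: a real system would test against the full robot
    geometry (e.g. swept volumes or meshes), not a single sphere.
    """
    return min_distance(cluster, center) <= radius + margin
```

A brute-force minimum distance like this is O(n) per query; the benchmarked algorithms in the paper would typically use spatial structures (octrees, bounding-volume hierarchies) to scale to dense point clouds.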
Visual Marker Guided Point Cloud Registration in a Large Multi-Sensor Industrial Robot Cell
This paper presents a benchmark and accuracy analysis of 3D sensor calibration in a large industrial robot cell. The sensor used was the Kinect v2, which contains both an RGB and an IR camera and measures depth based on the time-of-flight principle. The approach taken was a novel procedure combining ArUco visual markers, region-of-interest methods, and the iterative closest point (ICP) algorithm. The calibration of sensors is performed pairwise, exploiting the fact that time-of-flight sensors can have some overlap in the generated point cloud data. For a volume measuring 10 m × 14 m × 5 m, a typical point cloud accuracy of 5–10 cm was achieved using six sensor nodes.
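The core of the pairwise registration step is estimating a rigid transform between corresponding points. As a hedged, much-simplified sketch, the code below solves the 2D version in closed form, with correspondences assumed already known (in the paper, the ArUco markers and ROI cropping supply the initial pairing before ICP refines it). This is not the paper's 3D implementation, which would use an SVD-based solve inside PCL's ICP.

```python
import math


def rigid_align_2d(src, dst):
    """Closed-form rotation and translation mapping src onto dst.

    src and dst are paired lists of (x, y) points. Returns (theta, (tx, ty))
    such that dst ~= R(theta) @ src + t. Illustrative 2D stand-in for the
    point-to-point alignment step of ICP.
    """
    n = len(src)
    csx = sum(p[0] for p in src) / n
    csy = sum(p[1] for p in src) / n
    cdx = sum(p[0] for p in dst) / n
    cdy = sum(p[1] for p in dst) / n
    # Accumulate cross and dot terms of the centred point pairs;
    # their ratio gives the rotation angle.
    sc = sd = 0.0
    for (sx, sy), (dx, dy) in zip(src, dst):
        ax, ay = sx - csx, sy - csy
        bx, by = dx - cdx, dy - cdy
        sc += ax * by - ay * bx   # cross -> sin component
        sd += ax * bx + ay * by   # dot   -> cos component
    theta = math.atan2(sc, sd)
    c, s = math.cos(theta), math.sin(theta)
    # Translation aligns the rotated source centroid with the target centroid.
    tx = cdx - (c * csx - s * csy)
    ty = cdy - (s * csx + c * csy)
    return theta, (tx, ty)
```

In a full ICP loop, this solve alternates with a nearest-neighbour correspondence search until the alignment error converges; the 3D case replaces the closed-form angle with an SVD of the cross-covariance matrix.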