
AUTHOR

Erind Ujkani

Showing 3 related works from this author

Industrial Environment Mapping Using Distributed Static 3D Sensor Nodes

2018

This paper presents a system architecture for mapping and real-time monitoring of a relatively large industrial robotic environment measuring 10 m × 15 m × 5 m. Six sensor nodes with embedded computing power and local processing of the 3D point clouds are placed close to the ceiling. The system architecture and data processing are based on the Robot Operating System (ROS) and the Point Cloud Library (PCL). The 3D sensors used are Microsoft Kinect for Xbox One units, and point cloud data is collected at 20 Hz. A new manual calibration procedure is developed using reflective planes. The specified range of the sensor used is 0.8 m to 4.2 m, while depth data up to 9 m is used in this paper. Despite t…
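As an illustration of the kind of per-node processing such an architecture implies, the following C++ sketch shows a ROS node that subscribes to a Kinect point cloud topic, downsamples it locally with a PCL voxel grid, and republishes the reduced cloud. This is a minimal sketch, not the authors' implementation; the topic names and the 5 cm leaf size are assumptions.

// Sketch of local point cloud processing on a distributed sensor node (illustrative only).
#include <ros/ros.h>
#include <sensor_msgs/PointCloud2.h>
#include <pcl_conversions/pcl_conversions.h>
#include <pcl/point_cloud.h>
#include <pcl/point_types.h>
#include <pcl/filters/voxel_grid.h>

ros::Publisher pub;

void cloudCallback(const sensor_msgs::PointCloud2ConstPtr& msg)
{
  pcl::PointCloud<pcl::PointXYZ>::Ptr cloud(new pcl::PointCloud<pcl::PointXYZ>);
  pcl::fromROSMsg(*msg, *cloud);

  // Local processing on the sensor node: voxel-grid downsampling to reduce
  // the amount of data streamed to the central map at 20 Hz.
  pcl::VoxelGrid<pcl::PointXYZ> voxel;
  voxel.setInputCloud(cloud);
  voxel.setLeafSize(0.05f, 0.05f, 0.05f);  // 5 cm voxels (assumed value)
  pcl::PointCloud<pcl::PointXYZ> filtered;
  voxel.filter(filtered);

  sensor_msgs::PointCloud2 out;
  pcl::toROSMsg(filtered, out);
  out.header = msg->header;
  pub.publish(out);
}

int main(int argc, char** argv)
{
  ros::init(argc, argv, "sensor_node_filter");
  ros::NodeHandle nh;
  // "/kinect/points" and "/kinect/points_filtered" are placeholder topic names.
  ros::Subscriber sub = nh.subscribe("/kinect/points", 1, cloudCallback);
  pub = nh.advertise<sensor_msgs::PointCloud2>("/kinect/points_filtered", 1);
  ros::spin();
  return 0;
}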

Data processing; Computer science; Real-time computing; Point cloud; 0102 computer and information sciences; 02 engineering and technology; Ceiling (cloud); 01 natural sciences; 020202 computer hardware & architecture; 010201 computation theory & mathematics; 0202 electrical engineering electronic engineering information engineering; Benchmark (computing); Systems architecture; Calibration; Metre; Reflection mapping

2018 14th IEEE/ASME International Conference on Mechatronic and Embedded Systems and Applications (MESA)

Real-time human collision detection for industrial robot cells

2017

A collision detection system triggered by human motion was developed using the Robot Operating System (ROS) and the Point Cloud Library (PCL). ROS was used as the core of the programs and for communication with an industrial robot. The depth fields from the 3D cameras were combined using PCL, which was also the underlying tool for segmenting the human from the registered point clouds. Several collision detection algorithms were benchmarked in order to compare the solution. The registration process gave satisfactory results when testing the repeatability and the accuracy of the implementation. The segmentation algorithm was able to segment a person represente…
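A minimal sketch of one way a collision trigger over registered point clouds can be expressed with PCL is given below. It is not the paper's implementation; the box-shaped safety zone, its coordinates and the 50-point noise threshold are hypothetical.

// Sketch: flag a potential collision when points from the registered scene cloud
// enter a box-shaped safety zone around the robot (illustrative only).
#include <pcl/point_cloud.h>
#include <pcl/point_types.h>
#include <pcl/filters/crop_box.h>
#include <Eigen/Dense>

bool humanInSafetyZone(const pcl::PointCloud<pcl::PointXYZ>::ConstPtr& scene)
{
  // Axis-aligned safety volume around the robot (placeholder coordinates, metres).
  pcl::CropBox<pcl::PointXYZ> box;
  box.setInputCloud(scene);
  box.setMin(Eigen::Vector4f(-1.0f, -1.0f, 0.0f, 1.0f));
  box.setMax(Eigen::Vector4f( 1.0f,  1.0f, 2.0f, 1.0f));

  pcl::PointCloud<pcl::PointXYZ> inside;
  box.filter(inside);

  // Trigger when more than a handful of points (assumed noise threshold) lie inside.
  return inside.size() > 50;
}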

0209 industrial biotechnology; Computer science; business.industry; Point cloud; Process (computing); 02 engineering and technology; Benchmarking; Collision; law.invention; Industrial robot; 020901 industrial engineering & automation; law; Collision detection; Computer vision; Segmentation; Artificial intelligence; business; Collision avoidance

2017 IEEE International Conference on Multisensor Fusion and Integration for Intelligent Systems (MFI)

Visual Marker Guided Point Cloud Registration in a Large Multi-Sensor Industrial Robot Cell

2018

This paper presents a benchmark and accuracy analysis of 3D sensor calibration in a large industrial robot cell. The sensors used were Kinect v2 units, which contain both an RGB camera and an IR camera measuring depth based on the time-of-flight principle. The approach taken was based on a novel procedure combining Aruco visual markers, region-of-interest methods and iterative closest point. The calibration of the sensors is performed pairwise, exploiting the fact that time-of-flight sensors can have some overlap in the generated point cloud data. For a volume measuring 10 m × 14 m × 5 m, a typical accuracy of 5–10 cm in the generated point cloud data was achieved using six sensor nodes.
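A minimal sketch of pairwise registration refinement with PCL's IterativeClosestPoint, seeded with an initial transform assumed to come from a marker-based estimate, is shown below. The correspondence distance and iteration count are illustrative values, not the paper's exact procedure.

// Sketch: refine a marker-derived initial transform between two sensor nodes with ICP.
#include <pcl/point_cloud.h>
#include <pcl/point_types.h>
#include <pcl/registration/icp.h>
#include <Eigen/Dense>

Eigen::Matrix4f refinePairwise(const pcl::PointCloud<pcl::PointXYZ>::Ptr& source,
                               const pcl::PointCloud<pcl::PointXYZ>::Ptr& target,
                               const Eigen::Matrix4f& initial_guess)
{
  pcl::IterativeClosestPoint<pcl::PointXYZ, pcl::PointXYZ> icp;
  icp.setInputSource(source);
  icp.setInputTarget(target);
  icp.setMaxCorrespondenceDistance(0.2);  // 20 cm gate (assumed value)
  icp.setMaximumIterations(50);

  pcl::PointCloud<pcl::PointXYZ> aligned;
  icp.align(aligned, initial_guess);      // seed ICP with the marker-based estimate

  // Fall back to the initial guess if ICP does not converge.
  return icp.hasConverged() ? icp.getFinalTransformation() : initial_guess;
}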

Computer science; business.industry; 010401 analytical chemistry; Point cloud; Iterative closest point; Cloud computing; 02 engineering and technology; 01 natural sciences; 0104 chemical sciences; Visualization; law.invention; Industrial robot; law; 0202 electrical engineering electronic engineering information engineering; Benchmark (computing); Calibration; RGB color model; 020201 artificial intelligence & image processing; Computer vision; Artificial intelligence; business

2018 14th IEEE/ASME International Conference on Mechatronic and Embedded Systems and Applications (MESA)