RESEARCH PRODUCT
Visual contact with catadioptric cameras
Driss Aboutajdine, Sanaa Elfkihi, Cédric Demonceaux, Fatima Zahra Benamar, El Mustapha Mouaddib

subject
0209 industrial biotechnology, Computer science, General Mathematics, ComputingMethodologies_IMAGEPROCESSINGANDCOMPUTERVISION, Optical flow, 02 engineering and technology, Catadioptric system, 020901 industrial engineering & automation, Omnidirectional camera, Depth map, 0202 electrical engineering electronic engineering information engineering, [INFO.INFO-RB] Computer Science [cs]/Robotics [cs.RO], Computer vision, ComputingMilieux_MISCELLANEOUS, Pixel, Perspective (graphical), Mobile robot, Real image, Computer Science Applications, Control and Systems Engineering, Obstacle, 020201 artificial intelligence & image processing, Artificial intelligence, Software

description
Abstract Time to contact or time to collision (TTC) is information of utmost importance for animals as well as for mobile robots, because it enables them to avoid obstacles; it provides a convenient way to analyze the surrounding environment. The problem of TTC estimation has been widely discussed for perspective images. Although many works have shown the interest of omnidirectional cameras for robotic applications such as localization, motion estimation and monitoring, few works use omnidirectional images to compute the TTC. In this paper, we show that TTC can also be estimated on catadioptric images. We present two approaches for TTC estimation that use the optical flow, directly or indirectly, based on a de-rotation strategy. The first, called "gradient based TTC", is simple and fast and does not need an explicit estimation of the optical flow. Nevertheless, this method cannot provide a TTC at each pixel, is valid only for para-catadioptric sensors, and requires an initial segmentation of the obstacle. The second method, called "TTC map estimation based on optical flow", estimates the TTC at each point of the image and provides a depth map of the environment for any obstacle in any direction; it is valid for all central catadioptric sensors. Results and comparisons on synthetic and real images are given.
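The abstract only sketches the two approaches; the catadioptric formulations are given in the paper itself. As a rough, hypothetical illustration of the underlying idea (not the authors' method), the sketch below computes a per-pixel TTC map from a dense optical-flow field using the classical relation TTC ≈ 2 / div(flow), which holds for a perspective camera translating toward a fronto-parallel surface. The flow arrays `u`, `v`, the function name, and the threshold `eps` are assumptions introduced for illustration.

```python
import numpy as np

def ttc_map_from_flow(u, v, eps=1e-6):
    """Per-pixel time-to-contact from a dense optical-flow field (u, v).

    Hypothetical illustration of the classical relation TTC ~ 2 / div(flow)
    for a perspective camera translating toward a fronto-parallel surface.
    It is NOT the catadioptric formulation of the paper, which also accounts
    for the mirror geometry and a de-rotation step.
    """
    # Spatial derivatives of the two flow components (pixels per frame).
    du_dx = np.gradient(u, axis=1)
    dv_dy = np.gradient(v, axis=0)
    div = du_dx + dv_dy

    # Points with near-zero divergence (distant or non-approaching) get NaN.
    div_safe = np.where(np.abs(div) > eps, div, np.nan)
    return 2.0 / div_safe  # TTC expressed in number of frames
```

The returned map is in units of frames; multiplying by the inter-frame period would convert it to seconds.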
year | journal | country | edition | language |
---|---|---|---|---|
2015-02-01 | | | | |