P2D: a self-supervised method for depth estimation from polarimetry
Olivier Morel, Ralph Seulin, Daniel Braun, Désiré Sidibé, Marc Blanchon, Fabrice Meriaudeau

Subjects: Computer Science - Computer Vision and Pattern Recognition (cs.CV); monocular depth estimation; polarimetry; regularization; specularity; robustness; depth maps; transparency
Monocular depth estimation is a recurring subject in computer vision. Its ability to describe a scene via a depth map while relaxing the constraints of perspective geometry favors its use. However, despite constant algorithmic improvement, most methods exploit only colorimetric information; consequently, robustness to phenomena to which that modality is not sensitive, such as specularity or transparency, is neglected. In response, we propose using polarimetry as the input of a self-supervised monocular depth network, exploiting polarization cues to encourage accurate scene reconstruction. Furthermore, we add a polarimetric regularization term to a state-of-the-art method to take specific advantage of the data. Our method is evaluated both qualitatively and quantitatively, demonstrating that this new input modality, together with an enhanced loss function, improves depth estimation results, especially in specular areas.
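The abstract does not specify the form of the polarimetric regularization term. As an illustrative sketch only (NumPy stands in for a deep-learning framework, and `pred_cue` and `lam` are hypothetical names, not the paper's notation), polarization cues can be derived from four polarizer-angle intensity images via the linear Stokes parameters, and a disagreement penalty can be added to the usual photometric loss:

```python
import numpy as np

def stokes_from_intensities(i0, i45, i90, i135):
    """Linear Stokes parameters from intensity images captured behind
    polarizers at 0, 45, 90, and 135 degrees."""
    s0 = 0.5 * (i0 + i45 + i90 + i135)  # total intensity
    s1 = i0 - i90
    s2 = i45 - i135
    return s0, s1, s2

def degree_of_polarization(s0, s1, s2, eps=1e-6):
    """DoP = sqrt(S1^2 + S2^2) / S0; eps avoids division by zero."""
    return np.sqrt(s1 ** 2 + s2 ** 2) / (s0 + eps)

def total_loss(photometric_loss, pred_cue, dop, lam=0.1):
    """Hypothetical combined objective: photometric reconstruction loss
    plus a polarimetric regularizer penalizing disagreement between a
    network-derived cue (pred_cue) and the observed DoP."""
    polar_reg = np.mean((pred_cue - dop) ** 2)
    return photometric_loss + lam * polar_reg
```

This is a sketch of the general idea (polarization-aware loss shaping), not the actual loss used in the paper; the Stokes/DoP formulas themselves are the standard ones for linear polarization.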
year | journal | country | edition | language
---|---|---|---|---
2021-01-10 | | | |