Search results for "vision"

Showing 10 of 2,709 documents

Visual contact with catadioptric cameras

2015

Time to contact, or time to collision (TTC), is information of the utmost importance for animals as well as for mobile robots, because it enables them to avoid obstacles; it is a convenient way to analyze the surrounding environment. The problem of TTC estimation has been widely discussed for perspective images. Although many works have shown the value of omnidirectional cameras for robotic applications such as localization, motion estimation, and monitoring, few use omnidirectional images to compute the TTC. In this paper, we show that TTC can also be estimated on catadioptric images. We present two approaches for TTC estimation that use the optical flow, directly or indirectly, based on a de-rotation strat…
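Time to contact can be recovered from image measurements alone, without knowing depth or speed: for an approaching surface, TTC equals the apparent size of a feature divided by its rate of growth. A minimal sketch of that relation (an illustrative helper, not this paper's de-rotation method):

```python
def time_to_contact(size_prev, size_curr, dt):
    """TTC from apparent-size growth: tau = s / (ds/dt).

    size_prev, size_curr: apparent size (e.g. blob width in pixels) of the
    same object in two consecutive frames, dt seconds apart.
    """
    ds_dt = (size_curr - size_prev) / dt
    if ds_dt <= 0:
        return float("inf")  # object not approaching
    return size_prev / ds_dt
```

For a camera closing at 1 m/s on an object 10 m away, apparent size grows as 1/Z and the estimate approaches the true 10 s as dt shrinks.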

Keywords: optical flow; catadioptric system; omnidirectional camera; depth map; mobile robot; obstacle; computer vision; robotics

Hankelet-based action classification for motor intention recognition

2017

Powered lower-limb prostheses require a natural, easy-to-use interface for communicating an amputee's motor intention, in order to select the appropriate motor program in any given context, or simply to switch from the active (powered) to the passive mode of functioning. To be widely accepted, such an interface should not put additional cognitive load on the end-user, and it should be reliable and minimally invasive. In this paper we present one such interface, based on a robust method for detecting and recognizing motor actions from a low-cost wearable sensor network mounted on the sound leg and providing inertial (accelerometer, gyroscope and magnetometer) data in real time. We assume that the sensor…
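The "Hankelet" representation builds on Hankel matrices of short sensor sequences, whose rank structure characterizes the underlying LTI dynamics. A minimal sketch of constructing such a matrix from a 1-D inertial signal (illustrative only; the names are ours):

```python
import numpy as np

def hankel_matrix(signal, num_rows):
    """Stack overlapping windows of `signal` into a Hankel matrix:
    entry (i, j) is signal[i + j], so anti-diagonals are constant."""
    num_cols = len(signal) - num_rows + 1
    return np.array([signal[i:i + num_cols] for i in range(num_rows)])
```

For a noise-free output of an order-n LTI system, the rank of this matrix saturates at n, which is what makes it a usable action signature.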

Keywords: action recognition; motor intention recognition; powered (active) lower-limb prostheses; wearable sensor networks; Hankel matrix; LTI system theory; system identification; observability; support vector machine

Adaptive Neural Control of MIMO Nonstrict-Feedback Nonlinear Systems with Time Delay

2016

In this paper, an adaptive neural output-feedback tracking controller is designed for a class of multiple-input multiple-output (MIMO) nonstrict-feedback nonlinear systems with time delay. The system coefficients and the uncertain functions of the considered systems are both unknown. By employing neural networks to approximate the unknown function entries, and by constructing a new input-driven filter, a backstepping design method for the tracking controller is developed. The proposed controller guarantees that all signals in the closed-loop system are ultimately bounded and that the time-varying target signal is tracked within a small error. The main con…
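In such schemes the neural network is typically a linear-in-the-weights approximator, e.g. a radial basis function (RBF) network, used to model each unknown function entry. A minimal sketch of evaluating one (hypothetical, not this paper's construction):

```python
import math

def rbf_net(x, centers, widths, weights):
    """y = sum_i w_i * exp(-((x - c_i) / sigma_i)^2): a linear-in-the-weights
    approximator whose weights an adaptive law would tune online."""
    return sum(w * math.exp(-((x - c) / s) ** 2)
               for c, s, w in zip(centers, widths, weights))
```

The linearity in the weights is what lets Lyapunov-based adaptive laws guarantee bounded closed-loop signals.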

Keywords: adaptive tracking; multiple-input multiple-output (MIMO); neural networks; output-feedback controller; backstepping; nonlinear control; filter (signal processing)

Smart sensing and adaptive reasoning for enabling industrial robots with interactive human-robot capabilities in dynamic environments — a case study

2019

Traditional industry is seeing an increasing demand for more autonomous and flexible manufacturing in unstructured settings, a shift away from the fixed, isolated workspaces where robots perform predefined actions repetitively. This work presents a case study in which a robotic manipulator, namely a KUKA KR90 R3100, is provided with smart sensing capabilities such as vision and adaptive reasoning for real-time collision avoidance and online path planning in dynamically-changing environments. A machine vision module based on low-cost cameras and color detection in the hue, saturation, value (HSV) space is developed to make the robot aware of its changing environment. Therefore, this vision a…
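Color detection in HSV space works by thresholding hue, with minimum saturation and value bounds to reject gray and dark pixels. A minimal per-pixel sketch using the standard library (the thresholds are illustrative, not this paper's):

```python
import colorsys

def is_target_color(r, g, b, hue_lo, hue_hi, s_min=0.3, v_min=0.2):
    """RGB in [0, 1] -> True if the pixel's hue lies in [hue_lo, hue_hi]
    (hue in [0, 1)) and it is saturated and bright enough to be reliable."""
    h, s, v = colorsys.rgb_to_hsv(r, g, b)
    return hue_lo <= h <= hue_hi and s >= s_min and v >= v_min
```

Thresholding hue rather than raw RGB is what gives the detector its robustness to illumination changes.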

Keywords: smart sensing; adaptive reasoning; human-robot interaction; robot control; machine vision; path planning; collision avoidance; dynamic environments; robot manipulator; real-time computing

Selective visual odometry for accurate AUV localization

2015

In this paper we present a stereo visual odometry system developed for autonomous underwater vehicle (AUV) localization. The main idea is to use only highly reliable data in the estimation process, employing a robust keypoint-tracking approach and an effective keyframe-selection strategy, so that camera movements are estimated with high accuracy even over long paths. Furthermore, in order to limit drift error, camera pose estimation is referred to the last keyframe, which is selected by analyzing the temporal flow of features. The proposed system was tested on the KITTI evaluation framework and on the New Tsukuba stereo dataset to assess its effectiveness on long tracks and under different illumina…
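Referring pose estimation to the last keyframe bounds drift between keyframes; a new keyframe is needed once tracked features have moved, or been lost, beyond some tolerance. A minimal selection rule in that spirit (thresholds and names are ours, not this paper's strategy):

```python
import statistics

def need_new_keyframe(pixel_displacements, tracked_ratio,
                      max_median_px=30.0, min_ratio=0.6):
    """Trigger a new keyframe when the median feature displacement since
    the last keyframe is large, or too few of its features remain tracked."""
    if tracked_ratio < min_ratio:
        return True
    return statistics.median(pixel_displacements) > max_median_px
```

Keeping the reference frame fresh this way trades a little extra keyframe bookkeeping for much slower error accumulation over long tracks.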

Keywords: visual odometry; keyframe selection; RANSAC; feature matching; stereo camera; pose; AUV; underwater; autonomous robots

Rapid and robust on-site evaluation of articulated arm coordinate measuring machine performance

2018


Keywords: coordinate measuring machine; site evaluation; mechanical engineering; instrumentation; applied mathematics

2D/3D Object Recognition and Categorization Approaches for Robotic Grasping

2017

Object categorization and manipulation are critical tasks for a robot operating in a household environment. In this paper, we propose new methods for visual recognition and categorization. We describe a 2D object database and 3D point clouds with 2D/3D local descriptors, which we quantize with the k-means clustering algorithm to obtain a Bag of Words (BoW) representation. Moreover, we develop a new global descriptor, called VFH-Color, that combines the original Viewpoint Feature Histogram (VFH) descriptor with a color quantization histogram, thus adding appearance information that improves the recognition rate. The acquired 2D and 3D features are used for train…
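Quantizing local descriptors against a k-means codebook yields the BoW representation: each descriptor votes for its nearest cluster center, and the normalized vote counts form the image signature. A minimal sketch, assuming a precomputed codebook:

```python
import numpy as np

def bow_histogram(descriptors, codebook):
    """descriptors: (n, d) array; codebook: (k, d) k-means cluster centers.
    Returns the L1-normalized histogram of nearest-center assignments."""
    d2 = ((descriptors[:, None, :] - codebook[None, :, :]) ** 2).sum(axis=2)
    words = d2.argmin(axis=1)
    hist = np.bincount(words, minlength=len(codebook)).astype(float)
    return hist / hist.sum()
```

The resulting fixed-length vector is what a classifier (e.g. an SVM) is trained on, regardless of how many local descriptors each image produced.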

Keywords: object recognition; categorization; bag-of-words model; color quantization; histogram; deep belief network; cluster analysis; classifier; computer vision

Visual tracking with omnidirectional cameras: an efficient approach

2011

An effective technique for applying visual tracking algorithms to omnidirectional image sequences is presented. The method is based on a spherical image representation that takes into account the distortions and nonlinear resolution of omnidirectional images. Experimental results show that both deterministic and probabilistic tracking methods can be effectively adapted to robustly track an object with an omnidirectional camera.

Keywords: visual tracking; omnidirectional camera; nonlinear systems; representation (mathematics); computer vision

Real-time human collision detection for industrial robot cells

2017

A collision detection system triggered by human motion was developed using the Robot Operating System (ROS) and the Point Cloud Library (PCL). ROS was used as the core of the programs and for communication with an industrial robot. The depth fields from the 3D cameras were combined using PCL, which was also the underlying tool for segmenting the human from the registered point clouds. Several collision algorithms were benchmarked in order to compare the solution. The registration process gave satisfactory results when testing the repeatability and accuracy of the implementation. The segmentation algorithm was able to segment a person represente…
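Once the human is segmented from the registered clouds, the collision check reduces to a proximity test between the human's points and the robot's. A brute-force sketch (a real system would use PCL's spatial indices; the names here are ours):

```python
import math

def min_point_distance(cloud_a, cloud_b):
    """Brute-force minimum distance between two 3-D point clouds."""
    return min(math.dist(p, q) for p in cloud_a for q in cloud_b)

def in_collision(human_cloud, robot_cloud, safety_margin=0.5):
    """True when any human point is within `safety_margin` metres of the robot."""
    return min_point_distance(human_cloud, robot_cloud) < safety_margin
```

The O(n·m) scan is fine for a sketch; kd-trees or octrees bring the per-frame cost down to what a real-time safety loop needs.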

Keywords: point cloud; collision detection; collision avoidance; segmentation; industrial robot; benchmarking

Published in: 2017 IEEE International Conference on Multisensor Fusion and Integration for Intelligent Systems (MFI)

3D Point Cloud Descriptor for Posture Recognition

2018


Keywords: posture recognition; point cloud; computer vision; image processing