
RESEARCH PRODUCT

Hankelet-based action classification for motor intention recognition

Remzo Dedić, Marco La Cascia, Liliana Lo Presti, Antonio Chella, Haris Dindo

subject

Action recognition; Motor intention recognition; Powered (active) lower-limb prostheses; Wearable sensor networks; Hankel matrix; LTI system theory; Observability; System identification; Matrix (mathematics); Support vector machine; Computer vision; Artificial intelligence; Interface (computing); Match moving; Control and Systems Engineering; Computer Science Applications; Computer Vision and Pattern Recognition; Software; Mathematics (all); Computer science; Settore ING-INF/05 - Sistemi Di Elaborazione Delle Informazioni

description

Powered lower-limb prostheses require a natural and easy-to-use interface for communicating the amputee’s motor intention, in order to select the appropriate motor program in a given context or simply to switch from the active (powered) to the passive mode of functioning. To be widely accepted, such an interface should not put additional cognitive load on the end-user, and it should be reliable and minimally invasive. In this paper we present one such interface, based on a robust method for detecting and recognizing motor actions from a low-cost wearable sensor network mounted on the sound leg and providing inertial (accelerometer, gyroscope and magnetometer) data in real time. We assume that the sensor measurement trajectories – in a given temporal window – can be represented as the output of a linear time-invariant (LTI) system. We describe such a set of trajectories via a Hankel matrix, which embeds the observability matrix of the LTI system generating the trajectories. The use of Hankel matrices (known as Hankelets) avoids the burden of performing system identification while providing a computationally convenient descriptor for the dynamics of a time series. For the recognition of actions, we use two off-the-shelf classifiers, nearest neighbor (NN) and support vector machine (SVM), in cross-subject validation. We present results using either joint angles or raw sensor data, showing a clear improvement of the Hankelet-based approach over a baseline method. In addition, we compare results on action recognition using joint angles provided by trakSTAR, a high-accuracy motion tracking unit, demonstrating – somewhat surprisingly – that the best results (in terms of average recognition accuracy over different actions) are obtained with raw inertial data, paving the way towards wider use of our method in the field of active prosthetics in particular, and motor intention recognition in general.
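To make the Hankelet idea concrete, the following is a minimal sketch of how a normalized block-Hankel descriptor can be built from a windowed multivariate sensor stream and compared with a nearest-neighbor rule. The window length, the number of block rows, the Frobenius-norm normalization, and the dissimilarity measure follow the general Hankelet literature and are assumptions for illustration, not necessarily the exact choices made in the paper (the SVM variant, omitted here, can be trained on the same descriptors or on vectors of such dissimilarities).

```python
import numpy as np

def hankelet(window, num_block_rows=5):
    """Build a normalized block-Hankel descriptor from a time-series window.

    window : array of shape (T, d) -- T samples of d sensor channels
    num_block_rows : number of block rows (an assumed, tunable parameter)
    """
    window = np.asarray(window, dtype=float)
    T, d = window.shape
    num_cols = T - num_block_rows + 1
    # Column j stacks the samples j .. j+num_block_rows-1, flattened
    # sample-by-sample, giving the usual block-Hankel structure.
    H = np.column_stack([
        window[j:j + num_block_rows].reshape(-1)
        for j in range(num_cols)
    ])
    # Normalize so that ||H H^T||_F = 1 (a common Hankelet normalization).
    return H / np.sqrt(np.linalg.norm(H @ H.T, 'fro'))

def hankelet_dissimilarity(Ha, Hb):
    """Dissimilarity between two normalized Hankelets.

    With the normalization above, 2 - ||Ha Ha^T + Hb Hb^T||_F approaches 0
    when the two windows share the same underlying LTI dynamics.
    """
    return 2.0 - np.linalg.norm(Ha @ Ha.T + Hb @ Hb.T, 'fro')

def nn_classify(test_window, train_windows, train_labels, num_block_rows=5):
    """1-nearest-neighbor action classification over Hankelet descriptors."""
    Ht = hankelet(test_window, num_block_rows)
    dists = [hankelet_dissimilarity(Ht, hankelet(w, num_block_rows))
             for w in train_windows]
    return train_labels[int(np.argmin(dists))]
```

In this sketch no explicit system identification is performed: the descriptor is the (normalized) Hankel matrix itself, and two windows are compared directly through the subspaces spanned by their Hankelets.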

https://doi.org/10.1016/j.robot.2017.04.003