Infoscience
research article

Intent Prediction Based on Biomechanical Coordination of EMG and Vision-Filtered Gaze for End-Point Control of an Arm Prosthesis

Krausz, Nili E • Lamotte, Dennys • Batzianoulis, Iason • Hargrove, Levi • Micera, Silvestro • Billard, Aude
May 6, 2020
IEEE Transactions on Neural Systems and Rehabilitation Engineering

We propose a novel controller for powered prosthetic arms, in which fused EMG and gaze data predict the desired end-point for a full arm prosthesis, which could drive the forward motion of individual joints. We recorded EMG, gaze, and motion-tracking data during pick-and-place trials with 7 able-bodied subjects. Subjects positioned an object above a random target on a virtual interface, each completing around 600 trials. On average across all trials and subjects, gaze preceded EMG and followed a repeatable pattern that allowed for prediction. A computer vision algorithm was used to extract the initial and target fixations and estimate the target position in 2D space. Two SVRs were trained with EMG data to predict the x- and y-position of the hand; results showed that the y-estimate was significantly better than the x-estimate. The EMG and gaze predictions were fused using a Kalman filter-based approach, and the positional error using EMG alone was significantly higher than with the fusion of EMG and gaze. The final target-position root mean squared error (RMSE) decreased from 9.28 cm with an EMG-only prediction to 6.94 cm with gaze-EMG fusion. This error also increased significantly when some or all arm muscle signals were removed. However, with fused EMG and gaze, there was no significant difference between predictors that included all muscles and those that included only a subset.
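The Kalman-filter-based fusion described above can be illustrated with a minimal sketch: two noisy 2D end-point estimates (one standing in for the EMG-driven SVR output, one for the vision-filtered gaze fixation) are combined by a single measurement update that weights each by its covariance. All function names, noise levels, and dimensions here are invented for illustration and are not taken from the paper.

```python
import numpy as np

def fuse_estimates(z_emg, R_emg, z_gaze, R_gaze):
    """Fuse two noisy 2D position estimates with a single Kalman-style
    measurement update (inverse-covariance weighting). Illustrative only;
    the paper's actual filter formulation may differ."""
    # Treat the EMG estimate as the prior and the gaze estimate as a
    # measurement; the gain K weights each by its uncertainty.
    K = R_emg @ np.linalg.inv(R_emg + R_gaze)
    fused = z_emg + K @ (z_gaze - z_emg)
    fused_cov = (np.eye(2) - K) @ R_emg
    return fused, fused_cov

def rmse(pred, target):
    """Root mean squared Euclidean error over trials (cm)."""
    return float(np.sqrt(np.mean(np.sum((pred - target) ** 2, axis=1))))

# Synthetic demo (noise magnitudes invented): the EMG estimate is made
# worse along x than y, echoing the paper's finding that the EMG
# y-estimate was better, so fusing in gaze reduces overall error.
rng = np.random.default_rng(0)
targets = rng.uniform(0, 50, size=(600, 2))       # 600 pick-and-place trials
emg_noise = np.array([9.0, 4.0])                  # EMG noisier along x (cm)
gaze_noise = np.array([3.0, 3.0])
z_emg = targets + rng.normal(0, emg_noise, targets.shape)
z_gaze = targets + rng.normal(0, gaze_noise, targets.shape)

R_emg = np.diag(emg_noise ** 2)
R_gaze = np.diag(gaze_noise ** 2)
fused = np.array([fuse_estimates(e, R_emg, g, R_gaze)[0]
                  for e, g in zip(z_emg, z_gaze)])

print(f"EMG-only RMSE: {rmse(z_emg, targets):.2f} cm")
print(f"Fused RMSE:    {rmse(fused, targets):.2f} cm")
```

On this synthetic data the fused estimate has lower RMSE than either source alone, mirroring the qualitative result reported in the abstract (9.28 cm EMG-only vs 6.94 cm fused), though the numbers here are arbitrary.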

Type
research article
DOI
10.1109/TNSRE.2020.2992885
Author(s)
Krausz, Nili E
Lamotte, Dennys
Batzianoulis, Iason
Hargrove, Levi
Micera, Silvestro
Billard, Aude
Date Issued
2020-05-06
Published in
IEEE Transactions on Neural Systems and Rehabilitation Engineering
Volume
28
Issue
6
Start page
1471
End page
1480

Subjects
Prosthetics • Upper limb prosthesis • Electromyography • Gaze tracking • Sensory fusion • Computer vision • End-point control • Kalman Filter

Editorial or Peer reviewed
REVIEWED
Written at
EPFL

EPFL units
LASA  
NCCR-ROBOTICS  
TNE  
Relation
References
URL/DOI
https://infoscience.epfl.ch/record/266016?ln=en
Available on Infoscience
June 9, 2020
Use this identifier to reference this record
https://infoscience.epfl.ch/handle/20.500.14299/169178
  • Contact
  • infoscience@epfl.ch

Infoscience is a service managed and provided by the Library and IT Services of EPFL. © EPFL, all rights reserved.