Real-time and Recursive Estimators for Functional MRI Quality Assessment

Real-time quality assessment (rtQA) of functional magnetic resonance imaging (fMRI) based on blood oxygen level-dependent (BOLD) signal changes is critical for neuroimaging research and clinical applications. Losses of BOLD sensitivity caused by various types of technical and physiological noise remain major sources of fMRI artifacts. Because image distortions are difficult to perceive visually during data acquisition, comprehensive automatic rtQA is needed. To facilitate rapid rtQA of fMRI data, we applied real-time and recursive quality assessment methods to whole-brain fMRI volumes as well as to time-series of target brain areas and resting-state networks. We estimated recursive temporal signal-to-noise ratio (rtSNR) and contrast-to-noise ratio (rtCNR), and real-time head motion parameters via a framewise rigid-body transformation (translations and rotations) using conventional registration of the current volume to a template volume. In addition, we derived real-time framewise (FD) and micro (MD) displacements from the head motion parameters and evaluated the temporal derivative of the root-mean-square variance over voxels (DVARS). For monitoring time-series of target regions and networks, we estimated the number of spikes and the amount of filtered noise by means of a modified Kalman filter. Finally, we applied incremental general linear modeling (GLM) to evaluate the real-time contributions of nuisance regressors (linear trend and head motion). The proposed rtQA was demonstrated in real-time fMRI neurofeedback runs with and without excessive head motion, and in real-time simulations of neurofeedback and resting-state fMRI data. The rtQA was implemented as an extension of the open-source OpenNFT software, written in Python, MATLAB and C++, for neurofeedback, task-based, and resting-state paradigms. We also developed a general Python library to unify real-time fMRI data processing and neurofeedback applications. Flexible estimation and visualization of rtQA metrics facilitate efficient quality assessment of fMRI data and improve the robustness of fMRI acquisitions by substantiating decisions about whether an experiment needs to be interrupted and restarted, and by increasing confidence in neural estimates.
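For orientation, the sketch below illustrates how three of the measures named in the abstract (recursive temporal SNR, framewise displacement, and DVARS) can be computed frame by frame. It is a minimal illustration only, not the OpenNFT implementation: the Welford-style running variance, the Power et al. FD formulation with a 50 mm head radius, and all names (RecursiveTSNR, framewise_displacement, dvars) are assumptions introduced here.

```python
# Hypothetical sketch of recursive rtQA measures; not taken from OpenNFT.
import numpy as np


class RecursiveTSNR:
    """Recursive temporal SNR via a Welford-style running mean/variance."""

    def __init__(self):
        self.n = 0
        self.mean = None   # running mean per voxel
        self.m2 = None     # running sum of squared deviations per voxel

    def update(self, volume):
        """Add one fMRI frame (flattened voxel array) to the running estimate."""
        volume = np.asarray(volume, dtype=float)
        self.n += 1
        if self.mean is None:
            self.mean = volume.copy()
            self.m2 = np.zeros_like(volume)
        else:
            delta = volume - self.mean
            self.mean += delta / self.n
            self.m2 += delta * (volume - self.mean)

    @property
    def tsnr(self):
        """Per-voxel temporal SNR (mean / standard deviation) so far."""
        if self.mean is None or self.n < 2:
            return None
        std = np.sqrt(self.m2 / (self.n - 1))
        return np.divide(self.mean, std, out=np.zeros_like(std), where=std > 0)


def framewise_displacement(motion_prev, motion_curr, head_radius_mm=50.0):
    """FD in the common Power et al. sense: sum of absolute frame-to-frame changes
    in 3 translations (mm) and 3 rotations (rad, projected onto a head-sized sphere)."""
    d = np.abs(np.asarray(motion_curr, dtype=float) - np.asarray(motion_prev, dtype=float))
    return d[:3].sum() + head_radius_mm * d[3:].sum()


def dvars(volume_prev, volume_curr):
    """DVARS: root-mean-square over voxels of the frame-to-frame signal change."""
    diff = np.asarray(volume_curr, dtype=float) - np.asarray(volume_prev, dtype=float)
    return np.sqrt(np.mean(diff ** 2))


# Usage with synthetic data: update the estimators as each new frame arrives.
rng = np.random.default_rng(0)
est = RecursiveTSNR()
prev_vol, prev_mot = None, None
for t in range(10):
    vol = 100 + rng.normal(size=1000)       # flattened whole-brain volume
    mot = rng.normal(scale=0.05, size=6)    # [dx, dy, dz, pitch, roll, yaw]
    est.update(vol)
    if prev_vol is not None:
        print(f"t={t}: FD={framewise_displacement(prev_mot, mot):.3f} mm, "
              f"DVARS={dvars(prev_vol, vol):.3f}")
    prev_vol, prev_mot = vol, mot
print("median tSNR:", np.median(est.tsnr))
```

In an actual acquisition the volumes would come from the scanner's real-time export and the motion parameters from the online rigid-body realignment, but the recursive update pattern per frame would be the same.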

Type
research article
DOI
10.1007/s12021-022-09582-7
Web of Science ID
WOS:000769820300001
Author(s)
Davydov, Nikita
Peek, Lucas
Auer, Tibor
Prilepin, Evgeny
Gninenko, Nicolas
Van de Ville, Dimitri
Nikonorov, Artem
Koush, Yury
Date Issued
2022-03-17
Publisher
HUMANA PRESS INC
Published in
Neuroinformatics
Subjects
  • Computer Science, Interdisciplinary Applications
  • Neurosciences
  • Computer Science
  • Neurosciences & Neurology
  • real-time quality assessment
  • recursive
  • functional mri
  • task
  • rest
  • neurofeedback paradigms
  • opennft
  • rtspm python library
  • to-noise ratio
  • head motion
  • physiological noise
  • image registration
  • fmri data
  • connectivity
  • neurofeedback
  • assurance
  • sensitivity
  • artifacts
Editorial or Peer reviewed
REVIEWED
Written at
EPFL
EPFL units
MIPLAB
Available on Infoscience
April 11, 2022
Use this identifier to reference this record
https://infoscience.epfl.ch/handle/20.500.14299/187066