Latency Correction of Error Potentials Between Different Experiments Reduces Calibration Time for Single-Trial Classification

One fundamental limitation of EEG-based brain-computer interfaces is the time needed to calibrate the system prior to the detection of signals, due to the wide variety of issues affecting EEG measurements. For event-related potentials (ERPs), one of these sources of variability is the application performed: Protocols with different cognitive workloads might yield different latencies of the ERPs. However, the effect of these latency variations on single-trial classification remains unclear. This work studies the differences in the latencies of error potentials across three experiments with increasing cognitive workloads. A delay-correction algorithm based on the cross-correlation of the averaged signals is presented and tested with single-trial classification of the signals. The results showed that latency variations exist between different protocols, and that it is feasible to re-use data from previous experiments to calibrate a classifier able to detect the signals of a new experiment, thus reducing the calibration time.
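The delay-correction idea described in the abstract can be sketched as follows: estimate the lag between the grand-average ERPs of two experiments via the peak of their cross-correlation, then shift the single trials of the new experiment by that lag before classification. This is a minimal illustration under assumed function names and a simple circular-shift alignment, not the authors' actual implementation.

```python
import numpy as np

def estimate_delay(avg_ref, avg_new):
    """Estimate the lag (in samples) of avg_new relative to avg_ref
    from the peak of their full cross-correlation."""
    xcorr = np.correlate(avg_new, avg_ref, mode="full")
    # Lags of the 'full' output run from -(len(avg_ref)-1) to len(avg_new)-1.
    return int(np.argmax(xcorr)) - (len(avg_ref) - 1)

def align_trials(trials, delay):
    """Shift each single trial (rows of a trials-by-samples array) by
    -delay samples so it aligns with the reference experiment.
    A circular shift is used here purely as a sketch; a real pipeline
    would zero-pad or crop the epoch edges instead."""
    return np.roll(trials, -delay, axis=1)

# Toy example: a Gaussian "ERP" peaking at sample 100, and a copy
# of it delayed by 10 samples standing in for a higher-workload protocol.
t = np.arange(256)
avg_ref = np.exp(-((t - 100) ** 2) / 200.0)
avg_new = np.exp(-((t - 110) ** 2) / 200.0)
delay = estimate_delay(avg_ref, avg_new)  # expected: 10
aligned = align_trials(avg_new[None, :], delay)
```

After alignment, the classifier calibrated on the reference experiment can be applied to the shifted trials of the new experiment, which is the mechanism by which calibration time is reduced.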

Presented at:
34th International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC'12), San Diego, August 28 - September 1, 2012

Note: The status of this file is: Involved Laboratories Only

 Record created 2012-06-04, last modified 2019-08-12
