000177810 001__ 177810
000177810 005__ 20190316235408.0
000177810 0247_ $$2doi$$a10.1088/1741-2560/9/4/045011
000177810 022__ $$a1741-2560
000177810 02470 $$2ISI$$a000306759600012
000177810 037__ $$aARTICLE
000177810 245__ $$aTime-Dependent Approach for Single Trial Classification of Covert Visuospatial Attention
000177810 269__ $$a2012
000177810 260__ $$bInstitute of Physics$$c2012
000177810 336__ $$aJournal Articles
000177810 520__ $$aRecently, several studies have started to explore covert visuospatial attention as a control signal for brain–computer interfaces (BCIs). Covert visuospatial attention is the ability to shift the focus of attention from one point in space to another without overt eye movements. Nevertheless, the full potential and possible applications of this paradigm remain relatively unexplored. Voluntary covert visuospatial attention might allow a more natural and intuitive interaction with real environments, as neither stimulation nor gazing is required. In order to identify brain correlates of covert visuospatial attention, classical approaches usually rely on the whole α-band over long time intervals. In this work, we propose a more detailed analysis in the frequency and time domains to enhance classification performance. In particular, we investigate the contribution of α sub-bands and the role of time intervals in carrying information about visual attention. Previous neurophysiological studies have already highlighted the role of temporal dynamics in attention mechanisms. However, these important aspects are not yet exploited in BCI. In this work, we studied different methods that explicitly cope with the natural brain dynamics during visuospatial attention tasks in order to enhance BCI robustness and classification performance. Results with ten healthy subjects demonstrate that our approach identifies spectro-temporal patterns that outperform the state-of-the-art classification method. On average, our time-dependent classification reaches an area under the ROC (receiver operating characteristic) curve (AUC) of 0.74 ± 0.03, an increase of 12.3% with respect to standard methods (0.65 ± 0.4). In addition, the proposed approach allows faster classification (<1 s instead of 3 s) without compromising performance. Finally, our analysis highlights the fact that discriminant patterns are not stable over the whole trial period but change over short time intervals. These results support the hypothesis that visual attention information is actually indexed by subject-specific α sub-bands and is time dependent.
000177810 6531_ $$aBrain-Computer Interface
000177810 6531_ $$aCovert visuospatial attention
000177810 6531_ $$aSingle Trial
000177810 6531_ $$aEEG
000177810 700__ $$0242175$$g190240$$aTonin, Luca
000177810 700__ $$0242179$$g192497$$aLeeb, Robert
000177810 700__ $$aMillán, José del R.$$g149175$$0240030
000177810 773__ $$j9$$tJournal of Neural Engineering$$k4$$q045011
000177810 8564_ $$uhttps://infoscience.epfl.ch/record/177810/files/1741-2552_9_4_045011.pdf$$zPublisher's version$$s1394670$$yPublisher's version
000177810 909C0 $$xU12103$$0252018$$pCNBI
000177810 909C0 $$pCNP$$xU12599$$0252517
000177810 909CO $$qGLOBAL_SET$$pSTI$$particle$$ooai:infoscience.tind.io:177810
000177810 917Z8 $$x190240
000177810 917Z8 $$x190240
000177810 917Z8 $$x149175
000177810 917Z8 $$x190240
000177810 937__ $$aEPFL-ARTICLE-177810
000177810 973__ $$rREVIEWED$$sPUBLISHED$$aEPFL
000177810 980__ $$aARTICLE