000256245 001__ 256245
000256245 005__ 20190507143844.0
000256245 0247_ $$2doi$$a10.1109/TETCI.2018.2848289
000256245 037__ $$aARTICLE
000256245 245__ $$aDecoding Neural Correlates of Cognitive States to Enhance Driving Experience
000256245 260__ $$c2018-07-20
000256245 269__ $$a2018-07-20
000256245 336__ $$aJournal Articles
000256245 520__ $$aModern cars can support their drivers by assessing and autonomously performing different driving maneuvers based on information gathered by in-car sensors. We propose that brain–machine interfaces (BMIs) can provide complementary information that eases the interaction with intelligent cars and enhances the driving experience. In our approach, the human remains in control, while a BMI monitors the driver's cognitive state and this information is used to modulate the assistance provided by the intelligent car. In this paper, we gather our proof-of-concept studies demonstrating the feasibility of decoding electroencephalography correlates of upcoming actions, as well as correlates reflecting whether the decisions of driving assistant systems are in line with the drivers' intentions. Experimental results while driving both simulated and real cars consistently showed neural signatures of anticipation, movement preparation, and error processing. Remarkably, despite the increased noise inherent to real scenarios, these signals can be decoded on a single-trial basis, reflecting some of the cognitive processes that take place while driving. However, the moderate decoding performance compared to controlled experimental BMI paradigms indicates that there is room for improvement in the machine learning methods typically used in state-of-the-art BMIs. We foresee that the fusion of neural correlates with information extracted from other physiological measures, e.g., eye movements or electromyography, as well as contextual information gathered by in-car sensors, will allow intelligent cars to provide timely and tailored assistance only when it is required, thus keeping the user in the loop and allowing him to fully enjoy the driving experience.
000256245 6531_ $$aDecoding
000256245 6531_ $$aElectroencephalography
000256245 6531_ $$aAutomobiles
000256245 6531_ $$aCognition
000256245 6531_ $$aSymbiosis
000256245 6531_ $$aBrain-computer interfaces
000256245 700__ $$g137762$$0241256$$aChavarriaga, Ricardo
000256245 700__ $$aUscumlic, Marija
000256245 700__ $$aZhang, Huaijian
000256245 700__ $$aKhaliliardali, Zahra
000256245 700__ $$aAydarkhanov, Ruslan
000256245 700__ $$aSaeedi, Sareh
000256245 700__ $$aGheorghe, Lucian
000256245 700__ $$aMillan, Jose del R.
000256245 773__ $$q288-297$$k4$$j2$$tIEEE Transactions on Emerging Topics in Computational Intelligence
000256245 8560_ $$fricardo.chavarriaga@epfl.ch
000256245 8564_ $$uhttps://infoscience.epfl.ch/record/256245/files/Chavarriaga_IEEE_TETCI_2018_FINAL.pdf$$zFinal$$s14160616
000256245 909C0 $$pCNBI$$mricardo.chavarriaga@epfl.ch$$mjose.millan@epfl.ch$$xU12103$$0252018
000256245 909C0 $$xU12367$$pNCCR-ROBOTICS$$0252409
000256245 909CO $$qGLOBAL_SET$$pSTI$$particle$$ooai:infoscience.epfl.ch:256245
000256245 960__ $$aricardo.chavarriaga@epfl.ch
000256245 961__ $$amanon.velasco@epfl.ch
000256245 973__ $$aEPFL$$sPUBLISHED$$rREVIEWED
000256245 980__ $$aARTICLE
000256245 981__ $$aoverwrite
000256245 999C0 $$xU12599$$pCNP$$mbruno.herbelin@epfl.ch$$0252517