Please use this identifier to cite or link to this item: https://doi.org/10.21256/zhaw-22244
Full metadata record
DC Field: Value
dc.contributor.author: Aydarkhanov, Ruslan
dc.contributor.author: Ušćumlić, Marija
dc.contributor.author: Chavarriaga, Ricardo
dc.contributor.author: Gheorghe, Lucian
dc.contributor.author: Millán, José del R.
dc.date.accessioned: 2021-04-09T09:59:39Z
dc.date.available: 2021-04-09T09:59:39Z
dc.date.issued: 2021
dc.identifier.issn: 1741-2552
dc.identifier.issn: 1741-2560
dc.identifier.uri: https://digitalcollection.zhaw.ch/handle/11475/22244
dc.description.abstract: Objective. In contrast to the classical visual brain–computer interface (BCI) paradigms, which adhere to a rigid trial structure and restricted user behavior, electroencephalogram (EEG)-based visual recognition decoding during our daily activities remains challenging. The objective of this study is to explore the feasibility of decoding the EEG signature of visual recognition in experimental conditions promoting our natural ocular behavior when interacting with our dynamic environment. Approach. In our experiment, subjects visually search for a target object among suddenly appearing objects in the environment while driving a car simulator. Given that subjects exhibit unconstrained overt visual behavior, we based our study on eye fixation-related potentials (EFRPs). We report on gaze behavior and single-trial EFRP decoding performance (fixations on visually similar target vs. non-target objects). In addition, we demonstrate the application of our approach in a closed-loop BCI setup. Main results. To identify the target out of four symbol types along a road segment, the BCI system integrated decoding probabilities of multiple EFRPs and achieved an average online accuracy of 0.37 ± 0.06 (12 subjects), statistically significantly above the chance level. Using the acquired data, we performed a comparative study of classification algorithms (discriminating target vs. non-target) and feature spaces in a simulated online scenario. The EEG approaches yielded similarly moderate performances of at most 0.6 AUC, yet statistically significantly above the chance level. In addition, the gaze duration (dwell time) appears to be an additional informative feature in this context. Significance. These results show that visual recognition of sudden events can be decoded during active driving. Therefore, this study lays a foundation for assistive and recommender systems based on the driver's brain signals.
dc.language.iso: en
dc.publisher: IOP Publishing
dc.relation.ispartof: Journal of Neural Engineering
dc.rights: Licence according to publishing contract
dc.subject: Brain-computer interface
dc.subject: Driving
dc.subject: Electroencephalography
dc.subject: Eye tracking
dc.subject: Visual recognition
dc.subject.ddc: 150: Psychology
dc.title: Closed-loop EEG study on visual recognition during driving
dc.type: Journal article
dcterms.type: Text
zhaw.departement: School of Engineering
zhaw.organisationalunit: Institut für Informatik (InIT)
dc.identifier.doi: 10.1088/1741-2552/abdfb2
dc.identifier.doi: 10.21256/zhaw-22244
dc.identifier.pmid: 33494072
zhaw.funding.eu: No
zhaw.issue: 2
zhaw.originated.zhaw: Yes
zhaw.pages.start: 026010
zhaw.publication.status: acceptedVersion
zhaw.volume: 18
zhaw.embargo.end: 2022-02-26
zhaw.publication.review: Peer review (publication)
zhaw.webfeed: Datalab
zhaw.webfeed: Information Engineering
zhaw.webfeed: Machine Perception and Cognition
zhaw.author.additional: No
zhaw.display.portrait: Yes
Appears in collections: Publikationen School of Engineering

Files in This Item:
File: 2021_Aydarkhanov-etal_Closed-loop-EEG-study-visual-recognition-driving.pdf
Description: Accepted Version
Size: 3.29 MB
Format: Adobe PDF
Aydarkhanov, R., Ušćumlić, M., Chavarriaga, R., Gheorghe, L., & Millán, J. d. R. (2021). Closed-loop EEG study on visual recognition during driving. Journal of Neural Engineering, 18(2), 026010. https://doi.org/10.1088/1741-2552/abdfb2
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.