
dc.contributor.author: Kamavuako, Ernest Nlandu
dc.contributor.author: Sheikh, Usman Ayub
dc.contributor.author: Gilani, Syed Omer
dc.contributor.author: Jamil, Mohsin
dc.contributor.author: Niazi, Imran Khan
dc.date.accessioned: 2018-09-12T11:14:25Z
dc.date.available: 2018-09-12T11:14:25Z
dc.date.issued: 2018
dc.identifier.citation: Kamavuako, E.N., Sheikh, U.A., Gilani, S.O., Jamil, M., & Niazi, I.K. (2018). Classification of Overt and Covert Speech for Near-Infrared Spectroscopy-Based Brain Computer Interface. Sensors, 18, 2989. doi:10.3390/s18092989
dc.identifier.issn: 1424-8220
dc.identifier.uri: http://hdl.handle.net/10810/28663
dc.description: Published: 7 September 2018
dc.description.abstract: People suffering from neuromuscular disorders such as locked-in syndrome (LIS) are left in a paralyzed state with preserved awareness and cognition. In this study, it was hypothesized that changes in local hemodynamic activity, due to the activation of Broca’s area during overt/covert speech, can be harnessed to create an intuitive Brain Computer Interface based on Near-Infrared Spectroscopy (NIRS). A 12-channel square template was used to cover the inferior frontal gyrus, and changes in hemoglobin concentration corresponding to six aloud (overtly) and six silently (covertly) spoken words were collected from eight healthy participants. An unsupervised feature extraction algorithm was implemented with an optimized support vector machine for classification. For all participants, when considering overt and covert classes regardless of words, classification accuracy of 92.88 ± 18.49% was achieved with oxy-hemoglobin (O2Hb) and 95.14 ± 5.39% with deoxy-hemoglobin (HHb) as a chromophore. For a six-active-class problem of overtly spoken words, 88.19 ± 7.12% accuracy was achieved for O2Hb and 78.82 ± 15.76% for HHb. Similarly, for a six-active-class classification of covertly spoken words, 79.17 ± 14.30% accuracy was achieved with O2Hb and 86.81 ± 9.90% with HHb as an absorber. These results indicate that a control paradigm based on covert speech can be reliably implemented into future Brain–Computer Interfaces (BCIs) based on NIRS.
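As context for the abstract above: the classification step it describes (an optimized support vector machine applied to per-channel O2Hb/HHb features) can be sketched roughly as follows. This is only an illustrative example with assumed feature shapes, synthetic data, and a placeholder hyperparameter grid, not the authors' actual pipeline; in practice the features would come from the recorded hemodynamic responses rather than random data.

    # Illustrative sketch only: SVM classification of NIRS-style features,
    # loosely following the overt-vs-covert setup described in the abstract.
    # Data shapes, features, and hyperparameters are assumptions.
    import numpy as np
    from sklearn.model_selection import GridSearchCV, cross_val_score
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    rng = np.random.default_rng(0)

    # Hypothetical feature matrix: trials x (12 channels x 2 features,
    # e.g. mean and slope of the hemoglobin concentration change per channel).
    n_trials, n_channels = 96, 12
    X = rng.normal(size=(n_trials, n_channels * 2))
    y = rng.integers(0, 2, size=n_trials)   # 0 = overt, 1 = covert (binary case)

    # "Optimized" SVM: grid-search the RBF kernel's C and gamma as one common
    # way to tune an SVM; the grid values here are placeholders.
    svm = make_pipeline(
        StandardScaler(),
        GridSearchCV(
            SVC(kernel="rbf"),
            param_grid={"C": [0.1, 1, 10, 100], "gamma": ["scale", 0.01, 0.1]},
            cv=3,
        ),
    )

    # Outer cross-validation estimate of classification accuracy.
    scores = cross_val_score(svm, X, y, cv=5)
    print(f"accuracy: {scores.mean():.2%} +/- {scores.std():.2%}")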
dc.description.sponsorship: This research received no external funding.
dc.language.iso: eng
dc.publisher: Sensors
dc.rights: info:eu-repo/semantics/openAccess
dc.subject: brain computer interface
dc.subject: near infrared spectroscopy
dc.subject: overt and covert speech
dc.subject: unsupervised feature extraction
dc.subject: Broca’s area
dc.subject: decoding speech
dc.title: Classification of Overt and Covert Speech for Near-Infrared Spectroscopy-Based Brain Computer Interface
dc.type: info:eu-repo/semantics/article
dc.rights.holder: © 2018 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
dc.relation.publisherversion: http://www.mdpi.com/journal/sensors
dc.identifier.doi: 10.3390/s18092989

