Show simple item record

dc.contributor.author: Graña Romay, Manuel María
dc.contributor.author: Aguilar Moreno, Marina
dc.contributor.author: De Lope Asiaín, Javier
dc.contributor.author: Baglietto Araquistain, Ibai
dc.contributor.author: Garmendia, Xavier
dc.date.accessioned: 2021-02-18T13:03:49Z
dc.date.available: 2021-02-18T13:03:49Z
dc.date.issued: 2020-10
dc.identifier.citation: International Journal of Neural Systems 30(10) : (2020) // Article ID 2050053
dc.identifier.issn: 0129-0657
dc.identifier.issn: 1793-6462
dc.identifier.uri: http://hdl.handle.net/10810/50210
dc.description.abstract: Human activity recognition and neural activity analysis are the basis for human computational neuroethology research, which deals with the simultaneous analysis of behavioral ethogram descriptions and neural activity measurements. Wireless electroencephalography (EEG) and wireless inertial measurement units (IMUs) allow experimental data recording with improved ecological validity, where subjects can carry out natural activities while data recording remains minimally invasive. Specifically, we aim to show that EEG and IMU data fusion allows improved human activity recognition in a natural setting. We defined an experimental protocol composed of natural sitting, standing, and walking activities, and we recruited subjects at two sites: in-house (N = 4) and out-house (N = 12) populations with different demographics. Experimental protocol data capture was carried out with validated commercial systems. Classifier model training and validation were carried out with the scikit-learn open-source machine learning Python package. EEG features consist of the amplitudes of the standard EEG frequency bands. Inertial features were the instantaneous positions of the tracked body points after moving-average smoothing to remove noise. We carried out three validation processes: (a) a 10-fold cross-validation per experimental protocol repetition, (b) the inference of the ethograms, and (c) transfer learning from each experimental protocol repetition to the remaining repetitions. The in-house accuracy results were lower and much more variable than the out-house session results. In general, random forest was the best-performing classifier model. The best cross-validation results, ethogram accuracy, and transfer learning were achieved with the fusion of EEG and IMU data.
Transfer learning performed poorly compared to classification on the same protocol repetition, but its accuracy was still greater than 0.75 on average for the out-house data sessions. Transfer learning accuracy among repetitions of the same subject was above 0.88 on average. Ethogram prediction accuracy was above 0.96 on average. Therefore, we conclude that wireless EEG and IMUs allow the definition of natural experimental designs with high ecological validity toward human computational neuroethology research. The fusion of EEG and IMU signals improves activity and ethogram recognition.
dc.description.sponsorship: This work has been partially supported by FEDER funds through MINECO Project TIN2017-85827-P. Special thanks to Naiara Vidal from IMH, who conducted the recruitment process in the framework of the Langileok project funded by the Elkartek program. This project has received funding from the European Union's Horizon 2020 research and innovation programme under the Marie Sklodowska-Curie grant agreement No. 777720.
dc.language.iso: eng
dc.publisher: World Scientific Publishing
dc.relation: info:eu-repo/grantAgreement/EC/H2020/777720
dc.relation: info:eu-repo/grantAgreement/MINECO/TIN2017-85827-P
dc.rights: info:eu-repo/semantics/openAccess
dc.rights.uri: http://creativecommons.org/licenses/by/3.0/es/
dc.subject: neuroethology
dc.subject: activity recognition
dc.subject: EEG
dc.subject: inertial measurement
dc.subject: tracking
dc.subject: accuracy
dc.subject: system
dc.subject: field
dc.title: Improved Activity Recognition Combining Inertial Motion Sensors and Electroencephalogram Signals
dc.type: info:eu-repo/semantics/article
dc.rights.holder: This is an Open Access article published by World Scientific Publishing Company. It is distributed under the terms of the Creative Commons Attribution 4.0 (CC BY) license.
dc.rights.holder: Attribution 3.0 Spain
dc.relation.publisherversion: https://www.worldscientific.com/doi/10.1142/S0129065720500537?url_ver=Z39.88-2003&rfr_id=ori%3Arid%3Acrossref.org&rfr_dat=cr_pub%3Dpubmed&
dc.identifier.doi: 10.1142/S0129065720500537
dc.contributor.funder: European Commission
dc.departamentoes: Ciencia de la computación e inteligencia artificial
dc.departamentoeu: Konputazio zientziak eta adimen artifiziala

