Show simple item record

dc.contributor.author    Irastorza, Jon
dc.contributor.author    Torres Barañano, María Inés
dc.date.accessioned    2019-04-30T15:55:50Z
dc.date.available    2019-04-30T15:55:50Z
dc.date.issued    2018-08-26
dc.identifier.citation    Topics in Intelligent Engineering and Informatics 13 : 131-151 (2019)    es_ES
dc.identifier.isbn    978-3-319-95995-5
dc.identifier.issn    2193-9411
dc.identifier.uri    http://hdl.handle.net/10810/32594
dc.description.abstract    Machine learning researchers have dealt with the identification of emotional cues from speech, since it is a research domain with a large number of potential applications. Many acoustic parameters have been analyzed in the search for cues that identify emotional categories, and both classical classifiers and outstanding computational approaches have been developed. Experiments have mainly been carried out over induced emotions, although research is now shifting towards spontaneous emotions. In such a framework, it is worth mentioning that the expression of spontaneous emotions depends on cultural factors, on the particular individual and also on the specific situation. In this work we were interested in the emotional shifts occurring during conversation; in particular, we aimed to track the annoyance shifts appearing in phone conversations to complaint services. To this end we analyzed a set of audio files showing different ways of expressing annoyance. The call center operators identified disappointment, impotence or anger as expressions of annoyance. However, our experiments showed that variations of parameters derived from intensity, combined with some spectral information and suprasegmental features, are very robust for each speaker and annoyance rate. The work also discussed the annotation problem arising when dealing with human labelling of subjective events, and we proposed an extended rating scale in order to include annotator disagreements. Our frame classification results validated the chosen annotation procedure. Experimental results also showed that shifts in customer annoyance rates could potentially be tracked during phone calls.    es_ES
dc.description.sponsorship    Spanish MINECO under grant TIN2014-54288-C4-4-R and H2020 EU under the Empathic RIA, action number 769872.    es_ES
dc.language.iso    eng    es_ES
dc.publisher    Springer    es_ES
dc.relation    info:eu-repo/grantAgreement/EC/H2020/769872    es_ES
dc.relation    info:eu-repo/grantAgreement/MINECO/TIN2014-54288-C4-4-R    es_ES
dc.rights    info:eu-repo/semantics/openAccess    es_ES
dc.subject    emotion from speech    es_ES
dc.subject    intra-cognitive communication    es_ES
dc.subject    call centers    es_ES
dc.title    Tracking the Expression of Annoyance in Call Centers    es_ES
dc.type    info:eu-repo/semantics/bookPart    es_ES
dc.type    info:eu-repo/semantics/conferenceObject
dc.rights.holder    © Springer International Publishing AG, part of Springer Nature 2019    es_ES
dc.relation.publisherversion    https://link.springer.com/chapter/10.1007/978-3-319-95996-2_7    es_ES
dc.identifier.doi    10.1007/978-3-319-95996-2_7
dc.contributor.funder    European Commission
dc.departamentoes    Electricidad y electrónica    es_ES
dc.departamentoeu    Elektrizitatea eta elektronika    es_ES
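
The abstract above refers to per-frame features derived from intensity, combined with spectral information and suprasegmental cues. The chapter record does not specify any tooling, so the sketch below is only a minimal, hypothetical illustration of how such frame-level features could be extracted, here using the librosa library; the file name call.wav, the 16 kHz sample rate, the 25 ms / 10 ms framing and the pitch range are illustrative assumptions, not the authors' actual pipeline.

# Minimal illustrative sketch (not the authors' pipeline): per-frame
# intensity, spectral and pitch features of the kind mentioned in the
# abstract, extracted with librosa. File name and parameters are hypothetical.
import numpy as np
import librosa

def frame_features(path, sr=16000, frame_length=400, hop_length=160):
    """Return per-frame RMS energy, spectral centroid and F0 for one call."""
    y, sr = librosa.load(path, sr=sr)

    # Intensity proxy: root-mean-square energy over 25 ms frames, 10 ms hop.
    rms = librosa.feature.rms(y=y, frame_length=frame_length,
                              hop_length=hop_length)[0]

    # Spectral information: spectral centroid per frame.
    centroid = librosa.feature.spectral_centroid(y=y, sr=sr, n_fft=frame_length,
                                                 hop_length=hop_length)[0]

    # Suprasegmental cue: fundamental-frequency contour (YIN estimator).
    f0 = librosa.yin(y, fmin=60, fmax=400, sr=sr,
                     frame_length=2048, hop_length=hop_length)

    n = min(len(rms), len(centroid), len(f0))  # align frame counts
    return np.stack([rms[:n], centroid[:n], f0[:n]], axis=1)

features = frame_features("call.wav")  # shape: (n_frames, 3)
print(features.shape)

Per-speaker variation statistics computed over such trajectories could then feed a frame classifier of the kind the abstract evaluates; the classifier itself is outside the scope of this sketch.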

