Research reports
Journal articles:
A Satisfaction-based Model for Affect Recognition from Conversational Features in Spoken Dialog Systems
Year: 2013

Research areas
  • Artificial intelligence
  • Speech perception
  • Electronic and communications technology
  • Electrical, electronic and control engineering
  • Social, affective and emotional competence

Data
Description
Detecting user affect automatically during real-time conversation is the main challenge towards our greater aim of infusing social intelligence into a natural-language, mixed-initiative High-Fidelity (Hi-Fi) audio control spoken dialog agent. In recent years, studies on affect detection from voice have moved on to using realistic, non-acted data, which is subtler. However, subtler emotions are harder to perceive, as is demonstrated in tasks such as labelling and machine prediction. This paper attempts to address part of this challenge by considering the role of user satisfaction ratings and of conversational/dialog features in discriminating contentment and frustration, two types of emotions that are known to be prevalent within spoken human-computer interaction. However, given the laboratory constraints, users might be positively biased when rating the system, indirectly making the reliability of the satisfaction data questionable. Machine learning experiments were conducted on two datasets, one labelled by users and one by annotators, which were then compared in order to assess their reliability. Our results indicated that standard classifiers were significantly more successful in discriminating the abovementioned emotions and their intensities (reflected by user satisfaction ratings) from annotator data than from user data. These results corroborated that, first, satisfaction data could be used directly as an alternative target variable to model affect, and that it could be predicted exclusively from dialog features; second, this held only when predicting the abovementioned emotions from annotator data, suggesting that user bias does exist in a laboratory-led evaluation.
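The comparison described in the abstract can be illustrated with a minimal sketch: a standard classifier is trained on dialog features to predict the affect labels, once using labels derived from user satisfaction ratings and once using annotator labels, and the cross-validated accuracies of the two datasets are compared. This is an assumption-laden illustration, not the paper's actual pipeline: the feature names, file names, classifier choice (a scikit-learn SVM) and evaluation protocol are hypothetical.

# Illustrative sketch only; feature names, file names and classifier are assumptions.
import pandas as pd
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Hypothetical conversational/dialog features per interaction
# (e.g. number of turns, error-recovery turns, dialog duration).
FEATURES = ["num_turns", "num_repair_turns", "dialog_duration_s"]

def score_dataset(csv_path, label_column):
    """Cross-validated accuracy of a standard classifier that predicts an
    affect label (e.g. contentment vs. frustration) from dialog features."""
    data = pd.read_csv(csv_path)
    X = data[FEATURES]
    y = data[label_column]
    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
    return cross_val_score(clf, X, y, cv=10).mean()

# Compare the two label sources: labels derived from user satisfaction
# ratings vs. labels assigned by external annotators (hypothetical files).
user_acc = score_dataset("hifi_dialogs_user_labels.csv", "affect_label")
annot_acc = score_dataset("hifi_dialogs_annotator_labels.csv", "affect_label")
print(f"user-labelled data:      {user_acc:.3f}")
print(f"annotator-labelled data: {annot_acc:.3f}")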
International
Yes
ISI JCR
Yes
Journal title
Speech Communication
ISSN
0167-6393
JCR impact factor
1.267
Impact information
JCR data for 2011
Volume
55
DOI
Issue number
7-8
From page
825
To page
840
Month
NO MONTH
Ranking
13/30

This activity belongs to research reports

Participants
  • Author: Syaheerah Binti Lebai Lutfi (UPM)
  • Author: Fernando Fernández Martínez (Universidad Carlos III de Madrid)
  • Author: Juan Manuel Lucas Cuesta (UPM)
  • Author: Lorena López Lebón (Altran)
  • Author: Juan Manuel Montero Martinez (UPM)

Related research groups, departments, centres and R&D&I institutes
  • Creator: Research Group: Grupo de Tecnología del Habla
  • Department: Electronic Engineering