ISCA Archive Interspeech 2005

Tales of tuning - prototyping for automatic classification of emotional user states

Anton Batliner, Stefan Steidl, Christian Hacker, Elmar Nöth, Heinrich Niemann

Classification performance for the emotional user states found in the few available realistic, spontaneous databases is still rather low. We present a database of emotional children's speech recorded in a human-robot scenario. Baseline classification performance is 44.5% for seven classes and 59.2% for four classes. We discuss possible tuning strategies, e.g., using only prototypes (selected via annotation agreement or classification scores), or taking into account the requirements and feasibility of possible applications (weighting of false alarms or speaker-specific overall frequencies).
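The tuning idea of restricting training to prototypes, i.e. instances on which the human labelers largely agree, can be sketched roughly as follows. This is a minimal illustration, not the authors' pipeline: the agreement threshold, the synthetic features, the number of labelers, and the choice of a linear discriminant classifier are all assumptions made for the example.

```python
# Illustrative sketch of prototype selection by annotator agreement.
# Synthetic data; threshold, features, and classifier are assumptions.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

n_samples, n_features, n_labelers, n_classes = 500, 20, 5, 4
X = rng.normal(size=(n_samples, n_features))       # acoustic/prosodic features (synthetic)
true_labels = rng.integers(0, n_classes, size=n_samples)  # e.g. 4 cover classes after mapping 7 -> 4

# Simulated annotations: each labeler matches the underlying label with some noise.
annotations = np.where(
    rng.random((n_samples, n_labelers)) < 0.7,
    true_labels[:, None],
    rng.integers(0, n_classes, size=(n_samples, n_labelers)),
)

# Majority label and per-instance agreement rate.
majority = np.array([np.bincount(row, minlength=n_classes).argmax() for row in annotations])
agreement = (annotations == majority[:, None]).mean(axis=1)

# Prototypes: instances where at least 4 of 5 labelers agree (threshold is an assumption).
proto_mask = agreement >= 0.8
X_proto, y_proto = X[proto_mask], majority[proto_mask]

clf = LinearDiscriminantAnalysis()
print("all data   :", cross_val_score(clf, X, majority, cv=5).mean())
print("prototypes :", cross_val_score(clf, X_proto, y_proto, cv=5).mean())
```

On real data, training and evaluating on such prototypes typically yields higher recognition rates than using all (partly ambiguous) cases, at the cost of covering fewer instances.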


doi: 10.21437/Interspeech.2005-323

Cite as: Batliner, A., Steidl, S., Hacker, C., Nöth, E., Niemann, H. (2005) Tales of tuning - prototyping for automatic classification of emotional user states. Proc. Interspeech 2005, 489-492, doi: 10.21437/Interspeech.2005-323

@inproceedings{batliner05_interspeech,
  author={Anton Batliner and Stefan Steidl and Christian Hacker and Elmar Nöth and Heinrich Niemann},
  title={{Tales of tuning - prototyping for automatic classification of emotional user states}},
  year=2005,
  booktitle={Proc. Interspeech 2005},
  pages={489--492},
  doi={10.21437/Interspeech.2005-323}
}