A short review and primer on the use of human voice in human computer interaction applications

Published 23 Sep 2016 in cs.HC (arXiv:1609.07343v1)

Abstract: The application of psychophysiology in human-computer interaction is a growing field with significant potential for future smart personalised systems. Working in this emerging field requires comprehension of an array of physiological signals and analysis techniques. Human speech affords, alongside linguistic content, rich information in the intonation, voice quality, prosody, and rhythmic variation of utterances, allowing listeners to recognise numerous distinct emotional states in the speaker. Several types of factors affect speech, ranging from emotions to cognitive load and pathological conditions, providing a promising non-intrusive source for online understanding of context and psychophysiological state. This paper aims to serve as a primer for the novice, enabling rapid familiarisation with the latest core concepts. We put special emphasis on everyday human-computer interface applications to distinguish from the more common clinical or sports uses of psychophysiology. This paper is an extract from a comprehensive review of the entire field of ambulatory psychophysiology, including 12 similar chapters, plus application guidelines and a systematic review. Thus any citation should be made using the following reference: B. Cowley, M. Filetti, K. Lukander, J. Torniainen, A. Henelius, L. Ahonen, O. Barral, I. Kosunen, T. Valtonen, M. Huotilainen, N. Ravaja, G. Jacucci. The Psychophysiology Primer: a guide to methods and a broad review with a focus on human-computer interaction. Foundations and Trends in Human-Computer Interaction, vol. 9, no. 3-4, pp. 150--307, 2016.
