Affect sensing in an affective interactive e-theatre for autistic children

Li Zhang

Research output: Chapter in Book/Report/Conference proceeding › Chapter


Abstract

In this paper we describe an interactive emotional
social virtual framework for autistic young people aged 11-14
to learn verbal and non-verbal emotional expression in
role-play situations. In order to provide an interactive
learning environment with some degree of automatic
metaphorical understanding from open-ended text input,
we present new developments in affect detection for the
processing of several types of metaphorical language.
Emotional gestures and facial animations have been created
for users' avatars and are activated by the affective states
detected in users' text input. Emotion appraisal and affect
detection functions are embodied in an intelligent
conversational agent, which interacts with human users for
drama improvisation. We have also conducted user testing
with 24 autistic children to evaluate the overall framework.
The work has the potential to improve autistic young
people's language-learning ability and their social
connection with other people. It contributes to the
conference themes of natural language processing,
interactive affective social interfaces and user studies of
intelligent interfaces.
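
The abstract does not give implementation details, but the core loop it describes - detect an affective state from a user's open-ended text input, then trigger matching gesture and facial animations on that user's avatar - can be sketched. The following Python snippet is a purely illustrative toy, not the paper's method: the lexicon, animation names and mapping are all hypothetical, and the paper's metaphor-aware affect detection is considerably richer than keyword matching.

```python
# Illustrative sketch only (not the paper's algorithm): map affect labels
# detected in a user's text input to avatar animations, as the abstract
# describes. All names and cue words below are hypothetical.

# Toy lexicon-based detector; the actual system analyses metaphorical
# language (e.g. heat metaphors for anger), which this does not capture.
AFFECT_LEXICON = {
    "angry": ["furious", "boiling", "exploded"],
    "sad":   ["down", "sinking", "heartbroken"],
    "happy": ["glowing", "over the moon", "thrilled"],
}

# Hypothetical mapping from detected affect to gesture/face animation clips.
ANIMATIONS = {
    "angry":   ("gesture_fist_shake", "face_frown"),
    "sad":     ("gesture_slump", "face_downcast"),
    "happy":   ("gesture_open_arms", "face_smile"),
    "neutral": ("gesture_idle", "face_neutral"),
}

def detect_affect(utterance: str) -> str:
    """Return the first affect whose cue words appear in the utterance."""
    text = utterance.lower()
    for affect, cues in AFFECT_LEXICON.items():
        if any(cue in text for cue in cues):
            return affect
    return "neutral"

def animate_avatar(utterance: str) -> tuple[str, str]:
    """Select the gesture and facial animation for the detected affect."""
    return ANIMATIONS[detect_affect(utterance)]

print(animate_avatar("I was boiling with rage when he said that"))
# ('gesture_fist_shake', 'face_frown')
```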
Original language: English
Title of host publication: Proceedings of the 6th International Conference on Natural Language Processing and Knowledge Engineering, NLP-KE 2010
Publisher: IEEE Computer Society
Pages: 1-8
ISBN (Print): 9781424468966
DOIs
Publication status: Published - 2010
Event: 2010 International Conference on Natural Language Processing and Knowledge Engineering (NLP-KE) - Beijing, China
Duration: 21 Aug 2010 - 23 Aug 2010

Conference

Conference: 2010 International Conference on Natural Language Processing and Knowledge Engineering (NLP-KE)
Period: 21/08/10 - 23/08/10

