Emotional input for character-based interactive storytelling

M. O. (Marc) Cavazza, D. (David) Pizzi, F. (Fred) Charles, T. (Thurid) Vogt, E. (Elisabeth) André

Research output: Conference contribution (Chapter in Book/Report/Conference proceeding)

Abstract

In most Interactive Storytelling systems, user interaction is based on natural language communication with virtual agents, either through isolated utterances or through dialogue. Natural language communication is also an essential element of interactive narratives in which the user is supposed to impersonate one of the story’s characters. Whilst techniques for narrative generation and agent behaviour have made significant progress in recent years, natural language processing remains a bottleneck hampering the scalability of Interactive Storytelling systems. In this paper, we introduce a novel interaction technique based solely on emotional speech recognition. It allows the user to take part in dialogue with virtual actors without any constraints on style or expressivity, by mapping the recognised emotional categories to narrative situations and virtual characters’ feelings. Our Interactive Storytelling system uses an emotional planner to drive characters’ behaviours. The main feature of this approach is that characters’ feelings are part of the planning domain and are at the heart of narrative representations. The emotional speech recogniser analyses the speech signal to produce a variety of features which can be used to define ad hoc categories on which to train the system. The content of our interactive narrative is an adaptation of one chapter of the nineteenth-century classic novel Madame Bovary, which is well suited to a formalisation in terms of characters’ feelings. At various stages of the narrative, the user can address the main character or respond to her, impersonating her lover. The emotional category extracted from the user utterance can be analysed in terms of the current narrative context, which includes characters’ beliefs, feelings and expectations, to produce a specific influence on the target character, which becomes visible through a change in its behaviour, achieving a high level of realism for the interaction.
A limited number of emotional categories is sufficient to drive the narrative across multiple courses of action, even though the narrative comprises over thirty narrative functions. We report results from a fully implemented prototype, both in terms of proof of concept and of usability through a preliminary user study.
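The mapping described in the abstract — from a recognised emotional category, interpreted against the current narrative context, to an influence on a character's feelings — can be sketched as follows. This is a minimal illustrative sketch, not the paper's implementation: all names (`NarrativeContext`, `INFLUENCES`, `apply_emotional_input`), the specific categories, and the numeric weights are hypothetical.

```python
from dataclasses import dataclass, field

@dataclass
class NarrativeContext:
    """Beliefs, feelings and expectations of the target character (hypothetical model)."""
    feelings: dict = field(default_factory=lambda: {"affection": 0.5, "hope": 0.5})
    expects_reassurance: bool = True

# Hypothetical mapping: recognised emotional category -> list of (feeling, delta) influences.
INFLUENCES = {
    "warm":    [("affection", +0.2), ("hope", +0.1)],
    "neutral": [],
    "hostile": [("affection", -0.3), ("hope", -0.2)],
}

def apply_emotional_input(category: str, ctx: NarrativeContext) -> NarrativeContext:
    """Update the character's feelings from the recognised emotional category.

    If the character expected reassurance, a negative utterance hits harder --
    a simple stand-in for the context-sensitive analysis the abstract describes.
    """
    for feeling, delta in INFLUENCES.get(category, []):
        if ctx.expects_reassurance and delta < 0:
            delta *= 1.5  # dashed expectations amplify the negative impact
        # Clamp feeling intensities to [0, 1].
        ctx.feelings[feeling] = min(1.0, max(0.0, ctx.feelings[feeling] + delta))
    return ctx

ctx = apply_emotional_input("hostile", NarrativeContext())
print(ctx.feelings)  # affection and hope both lowered
```

In the paper's actual system, such feeling updates feed into an emotional planner that selects the character's next actions; here the update alone stands in for that step.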
Original language: English
Title of host publication: Proceedings of The 8th International Conference on Autonomous Agents and Multiagent Systems
Editors: Carles Sierra
Publisher: International Foundation for Autonomous Agents
Pages: 313-320
ISBN (Print): 9780981738161
Publication status: Published - May 2009
Event: 8th International Conference on Autonomous Agents and Multiagent Systems - Budapest, Hungary
Duration: 10 May 2009 - 15 May 2009
https://dl.acm.org/citation.cfm?id=1558013&picked=prox

Conference

Conference: 8th International Conference on Autonomous Agents and Multiagent Systems
Abbreviated title: AAMAS '09
Country: Hungary
City: Budapest
Period: 10/05/09 - 15/05/09


  • Cite this

    Cavazza, M., Pizzi, D., Charles, F., Vogt, T., & André, E. (2009). Emotional input for character-based interactive storytelling. In C. Sierra (Ed.), Proceedings of The 8th International Conference on Autonomous Agents and Multiagent Systems (pp. 313-320). International Foundation for Autonomous Agents.