Emotional input for character-based interactive storytelling

M. O. (Marc) Cavazza, D. (David) Pizzi, F. (Fred) Charles, T. (Thurid) Vogt, E. (Elisabeth) André

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Research

Abstract

In most Interactive Storytelling systems, user interaction is based on natural language communication with virtual agents, either through isolated utterances or through dialogue. Natural language communication is also an essential element of interactive narratives in which the user is supposed to impersonate one of the story's characters. Whilst techniques for narrative generation and agent behaviour have made significant progress in recent years, natural language processing remains a bottleneck hampering the scalability of Interactive Storytelling systems. In this paper, we introduce a novel interaction technique based solely on emotional speech recognition. It allows the user to take part in dialogue with virtual actors without any constraints on style or expressivity, by mapping the recognised emotional categories to narrative situations and to virtual characters' feelings. Our Interactive Storytelling system uses an emotional planner to drive characters' behaviours. The main feature of this approach is that characters' feelings are part of the planning domain and are at the heart of narrative representations. The emotional speech recogniser analyses the speech signal to produce a variety of features which can be used to define ad hoc categories on which to train the system. The content of our interactive narrative is an adaptation of one chapter of the nineteenth-century classic novel Madame Bovary, which is well suited to a formalisation in terms of characters' feelings. At various stages of the narrative, the user can address the main character or respond to her, impersonating her lover. The emotional category extracted from the user's utterance is analysed in terms of the current narrative context, which includes characters' beliefs, feelings and expectations, to produce a specific influence on the target character; this influence becomes visible through a change in the character's behaviour, achieving a high level of realism for the interaction. A limited number of emotional categories is sufficient to drive the narrative, which comprises over thirty narrative functions, across multiple courses of action. We report results from a fully implemented prototype, both as a proof of concept and in terms of usability, through a preliminary user study.
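The pipeline described in the abstract (recognised emotional category → interpretation against the current narrative context → change in the target character's feelings → planner-driven behaviour) can be illustrated with a minimal sketch. The code below is not the authors' implementation: the category labels, feeling names, numeric values and the function `apply_user_utterance` are all hypothetical, and the real system trains ad hoc categories on acoustic features rather than using fixed labels.

```python
from dataclasses import dataclass, field

@dataclass
class NarrativeContext:
    # The target character's feelings and expectations at the current point
    # in the story (names and value ranges are assumptions for illustration).
    feelings: dict = field(default_factory=lambda: {"hope": 0.5, "despair": 0.1})
    expectations: set = field(default_factory=set)  # e.g. {"reassurance"}

# Assumed mapping from recognised emotional categories in the user's speech
# to changes in the target character's feelings.
EMOTION_TO_FEELING_DELTA = {
    "affectionate": {"hope": +0.3, "despair": -0.2},
    "dismissive":   {"hope": -0.4, "despair": +0.3},
    "neutral":      {},
}

def apply_user_utterance(category: str, ctx: NarrativeContext) -> NarrativeContext:
    """Interpret a recognised emotional category in the current narrative
    context and update the character's feelings accordingly."""
    delta = EMOTION_TO_FEELING_DELTA.get(category, {})
    # A reaction that violates the character's expectations is weighted more
    # strongly -- a crude stand-in for the context-dependent interpretation
    # described in the abstract.
    weight = 1.5 if category == "dismissive" and "reassurance" in ctx.expectations else 1.0
    for feeling, change in delta.items():
        new_value = ctx.feelings.get(feeling, 0.0) + weight * change
        ctx.feelings[feeling] = max(0.0, min(1.0, new_value))
    return ctx

# An emotional planner would then select the character's next behaviour from
# the updated feelings, e.g. switching to another course of action once
# "despair" crosses a threshold.
if __name__ == "__main__":
    ctx = NarrativeContext(expectations={"reassurance"})
    print(apply_user_utterance("dismissive", ctx).feelings)
```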
Original language: English
Title of host publication: Proceedings of The 8th International Conference on Autonomous Agents and Multiagent Systems
Editors: Carles Sierra
Publisher: International Foundation for Autonomous Agents
Pages: 313-320
ISBN (Print): 9780981738161
Publication status: Published - May 2009
Event: 8th International Conference on Autonomous Agents and Multiagent Systems - Budapest, Hungary
Duration: 10 May 2009 - 15 May 2009
https://dl.acm.org/citation.cfm?id=1558013&picked=prox

Conference

Conference: 8th International Conference on Autonomous Agents and Multiagent Systems
Abbreviated title: AAMAS '09
Country: Hungary
City: Budapest
Period: 10/05/09 - 15/05/09
Internet address: https://dl.acm.org/citation.cfm?id=1558013&picked=prox

Fingerprint

Communication
Speech recognition
Scalability
Planning
Processing

Cite this

Cavazza, M. O., Pizzi, D., Charles, F., Vogt, T., & André, E. (2009). Emotional input for character-based interactive storytelling. In C. Sierra (Ed.), Proceedings of The 8th International Conference on Autonomous Agents and Multiagent Systems (pp. 313-320). International Foundation for Autonomous Agents.
Cavazza, M. O. (Marc) ; Pizzi, D. (David) ; Charles, F. (Fred) ; Vogt, T. (Thurid) ; André, E. (Elisabeth). / Emotional input for character-based interactive storytelling. Proceedings of The 8th International Conference on Autonomous Agents and Multiagent Systems. editor / Carles Sierra. International Foundation for Autonomous Agents, 2009. pp. 313-320
@inproceedings{6476126beaa243c487ff91487562e832,
title = "Emotional input for character-based interactive storytelling",
author = "Cavazza, {M. O. (Marc)} and Pizzi, {D. (David)} and Charles, {F. (Fred)} and Vogt, {T. (Thurid)} and Andr{\'e}, {E. (Elisabeth)}",
year = "2009",
month = "5",
language = "English",
isbn = "9780981738161",
pages = "313--320",
editor = "Carles Sierra",
booktitle = "Proceedings of The 8th International Conference on Autonomous Agents and Multiagent Systems",
publisher = "International Foundation for Autonomous Agents",

}

Cavazza, MO, Pizzi, D, Charles, F, Vogt, T & André, E 2009, Emotional input for character-based interactive storytelling. in C Sierra (ed.), Proceedings of The 8th International Conference on Autonomous Agents and Multiagent Systems. International Foundation for Autonomous Agents, pp. 313-320, 8th International Conference on Autonomous Agents and Multiagent Systems, Budapest, Hungary, 10/05/09.

Emotional input for character-based interactive storytelling. / Cavazza, M. O. (Marc); Pizzi, D. (David); Charles, F. (Fred); Vogt, T. (Thurid); André, E. (Elisabeth).

Proceedings of The 8th International Conference on Autonomous Agents and Multiagent Systems. ed. / Carles Sierra. International Foundation for Autonomous Agents, 2009. p. 313-320.

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Research

TY - GEN

T1 - Emotional input for character-based interactive storytelling

AU - Cavazza, M. O. (Marc)

AU - Pizzi, D. (David)

AU - Charles, F. (Fred)

AU - Vogt, T. (Thurid)

AU - André, E. (Elisabeth)

PY - 2009/5

Y1 - 2009/5

M3 - Conference contribution

SN - 9780981738161

SP - 313

EP - 320

BT - Proceedings of The 8th International Conference on Autonomous Agents and Multiagent Systems

A2 - Sierra, Carles

PB - International Foundation for Autonomous Agents

ER -

Cavazza MO, Pizzi D, Charles F, Vogt T, André E. Emotional input for character-based interactive storytelling. In Sierra C, editor, Proceedings of The 8th International Conference on Autonomous Agents and Multiagent Systems. International Foundation for Autonomous Agents. 2009. p. 313-320