Abstract
In this paper, we describe a novel method of combining emotional input with an Augmented Reality (AR) tracking and display system to produce dynamic interactive art that responds to the perceived emotional content of viewer reactions and interactions. As part of the CALLAS project, our aim is to explore multimodal interaction in an Arts and Entertainment context. The approach we describe has been implemented, in collaboration with a digital artist, as part of a prototype “showcase” designed to demonstrate how affective input from the audience of an interactive art installation can be used to enhance and enrich the aesthetic experience of the artistic work. We propose an affective model for combining emotionally loaded participant input with aesthetic interpretations of interaction, together with a mapping that controls properties of the dynamically generated digital art.
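The record does not specify how the affective model or the mapping is implemented. As a rough, non-authoritative sketch only, the snippet below assumes audience affect is reduced to a simple valence/arousal pair and mapped onto a few hypothetical rendering parameters (hue, saturation, motion speed); the names `AffectiveState`, `VisualProperties`, and `map_affect_to_visuals` are illustrative and do not come from the paper.

```python
# Illustrative sketch only: the valence/arousal representation and the visual
# parameters below are assumptions, not the model described in the paper.
from dataclasses import dataclass


@dataclass
class AffectiveState:
    """Perceived emotional content of audience input (assumed representation)."""
    valence: float  # -1.0 (negative) .. 1.0 (positive)
    arousal: float  #  0.0 (calm)     .. 1.0 (excited)


@dataclass
class VisualProperties:
    """Hypothetical properties of the dynamically generated artwork."""
    hue: float           # colour hue in degrees, 0..360
    saturation: float    # colour saturation, 0..1
    motion_speed: float  # relative animation speed


def map_affect_to_visuals(state: AffectiveState) -> VisualProperties:
    """Map an affective state onto rendering parameters.

    Positive, high-arousal input yields warm, saturated, fast-moving visuals;
    negative, low-arousal input yields cool, muted, slow visuals.
    """
    # Interpolate hue from warm (30 deg) for positive valence to cool (240 deg)
    # for negative valence.
    hue = 30.0 + (1.0 - (state.valence + 1.0) / 2.0) * 210.0
    saturation = 0.3 + 0.7 * state.arousal
    motion_speed = 0.1 + 2.0 * state.arousal
    return VisualProperties(hue=hue, saturation=saturation, motion_speed=motion_speed)


if __name__ == "__main__":
    # Example: a fairly positive, moderately excited audience reading.
    print(map_affect_to_visuals(AffectiveState(valence=0.8, arousal=0.6)))
```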
Original language | English |
---|---|
Number of pages | 6 |
Publication status | Published - 2007 |
Event | International Symposium on Mixed and Augmented Reality 2007 - Nara, Japan. Duration: 13 Nov 2007 → 16 Nov 2007 |
Conference
Conference | International Symposium on Mixed and Augmented Reality 2007 |
---|---|
Abbreviated title | ISMAR 2007 |
Country/Territory | Japan |
City | Nara |
Period | 13/11/07 → 16/11/07 |