An emotionally responsive AR art installation

Stephen Gilroy, Marc Cavazza, Rémi Chaignon, Satu-Marja Mäkelä, Markus Niiranen, Elisabeth André, Thurid Vogt, Mark Billinghurst, Hartmut Seichter, Maurice Benayoun

Research output: Contribution to conference › Paper › peer-review


Abstract

In this paper, we describe a novel method of combining emotional input and an Augmented Reality (AR) tracking/display system to produce dynamic interactive art that responds to the perceived emotional content of viewer reactions and interactions. As part of the CALLAS project, our aim is to explore multimodal interaction in an Arts and Entertainment context. The approach we describe has been implemented as part of a prototype “showcase”, developed in collaboration with a digital artist, designed to demonstrate how affective input from the audience of an interactive art installation can be used to enhance and enrich the aesthetic experience of the artistic work. We propose an affective model for combining emotionally-loaded participant input with aesthetic interpretations of interaction, together with a mapping which controls properties of dynamically generated digital art.
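
To give a concrete sense of the kind of affect-to-visual mapping the abstract describes, the following is a minimal illustrative sketch only, not the authors' implementation: it assumes a two-dimensional valence/arousal representation of audience affect, and all names, parameters, and constants are hypothetical.

    from dataclasses import dataclass

    @dataclass
    class AffectiveState:
        """Hypothetical two-dimensional affect representation (assumed, not the paper's model)."""
        valence: float  # -1.0 (negative) .. 1.0 (positive)
        arousal: float  #  0.0 (calm)     .. 1.0 (excited)

    @dataclass
    class VisualParams:
        """Illustrative rendering properties for the generated artwork."""
        hue: float              # colour hue in degrees, 0..360
        motion_speed: float     # relative animation speed
        particle_density: float # relative density of generated elements

    def map_affect_to_visuals(state: AffectiveState) -> VisualParams:
        """Toy mapping: positive valence shifts colour toward warm tones,
        while arousal drives motion speed and particle density."""
        hue = 200.0 - 140.0 * max(state.valence, 0.0)   # cool blues -> warm oranges
        motion_speed = 0.2 + 1.8 * state.arousal
        particle_density = 0.1 + 0.9 * state.arousal
        return VisualParams(hue, motion_speed, particle_density)

    # Example: an excited, positive audience reaction
    print(map_affect_to_visuals(AffectiveState(valence=0.7, arousal=0.9)))

In the installation itself, such parameters would feed the AR display pipeline each frame; the linear mapping above is only a placeholder for whatever aesthetic interpretation the artist specifies.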
Original language: English
Number of pages: 6
Publication status: Published - 2007
Event: International Symposium on Mixed and Augmented Reality 2007 - Nara, Japan
Duration: 13 Nov 2007 - 16 Nov 2007

Conference

Conference: International Symposium on Mixed and Augmented Reality 2007
Abbreviated title: ISMAR 2007
Country/Territory: Japan
City: Nara
Period: 13/11/07 - 16/11/07

Bibliographical note

ACM allows authors to post their own version of their ACM-copyrighted work on their personal server or on servers belonging to their employers.
