Proceedings: GI 2005

Exploring non-speech auditory feedback at an interactive multi-user tabletop

Mark Hancock, Chia Shen, Clifton Forlines, Kathy Ryall

Proceedings of Graphics Interface 2005: Victoria, British Columbia, Canada, 9-11 May 2005, 41-50

DOI 10.20380/GI2005.06

  • Bibtex

    @inproceedings{Hancock:2005:10.20380/GI2005.06,
    author = {Hancock, Mark and Shen, Chia and Forlines, Clifton and Ryall, Kathy},
    title = {Exploring non-speech auditory feedback at an interactive multi-user tabletop},
    booktitle = {Proceedings of Graphics Interface 2005},
    series = {GI 2005},
    year = {2005},
    issn = {0713-5424},
    isbn = {1-56881-265-5},
    location = {Victoria, British Columbia, Canada},
    pages = {41--50},
    numpages = {10},
    doi = {10.20380/GI2005.06},
    publisher = {Canadian Human-Computer Communications Society},
    address = {School of Computer Science, University of Waterloo, Waterloo, Ontario, Canada},
    }

Abstract

We present two experiments on the use of non-speech audio at an interactive multi-touch, multi-user tabletop display. We first investigate the use of two categories of reactive auditory feedback: affirmative sounds that confirm user actions and negative sounds that indicate errors. Our results show that affirmative auditory feedback may improve one's awareness of group activity at the expense of one's awareness of his or her own activity. Negative auditory feedback may also improve group awareness, but simultaneously increase the perception of errors for both the group and the individual. In our second experiment, we compare two methods of associating sounds with individuals in a co-located environment. Specifically, we compare localized sound, where each user has his or her own speaker, to coded sound, where users share one speaker but the waveforms of the sounds are varied so that a different sound is played for each user. The results of this experiment reinforce the tension between group awareness and individual focus found in the first experiment. User feedback suggests that users can more easily identify who caused a sound when either localized or coded sound is used, but that they are also better able to focus on their individual work. Our experiments show that, in general, auditory feedback can be used in co-located collaborative applications to support either individual work or group awareness, but not both simultaneously, depending on how it is presented.