About this project


Citizens of Montreal enjoy a wealth of artistic creations in public space, among them a large collection of sculptures. These works visually enrich our daily lives, for example while walking through an urban park. While the visually impaired community engages strongly with its audible environment, artworks such as sculptures generally remain inaccessible. Motivated by observations during our recent field study with the blind community within the In Situ Audio Services (ISAS) project, we became intrigued by the possibility of rendering an acoustic description of physical art forms, as an analogue to the experience that sighted individuals receive as they walk down the street. As a concrete example, we wanted to convey through auditory signals the physical features of one of the many sculptures on Mont-Royal, so that a blind individual can form a mental representation of its shape and, ideally, an appreciation of its aesthetics.

Inspired by early work on the sonification of shapes and textures [1, 2, 3], and by psychomechanics research on simulated sound sources and the intuition of object size [4], we set out to address this question. To develop a sonic representation of the sculptures that is useful and interesting for the blind, we adopted an interaction paradigm closely related to echolocation. This mode of gathering information about one's surroundings is familiar to many blind individuals: it is based on emitting short impulses and listening to the reflections from nearby objects. In brief, the audible sculpture listens for short impulsive sounds such as hand claps, finger snaps, or tongue clicks, and renders them back convolved with a sound buffer approximately 3 seconds in length. This convolution kernel constitutes the auditory representation of the 3D sculpture and is inspired by the model-based sonification method known as the data-sonogram [5]. In a data-sonogram, information is mapped to spatial and spectral parameters, and the rendering changes its characteristics depending on the listener's perspective and distance with respect to the sculpture.
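To make the interaction concrete, below is a minimal offline sketch of the detect-and-convolve idea in Python, using NumPy and SciPy. The file names, energy threshold, and frame size are illustrative assumptions, not the values of our prototype; in particular, sculpture_kernel.wav stands in for the kernel that our system derives from the 3D form of the sculpture, and a real installation would process a live microphone stream rather than a recorded WAV file.

    import numpy as np
    from scipy.io import wavfile
    from scipy.signal import fftconvolve

    # Illustrative parameters -- not the tuned values of the actual prototype.
    FRAME = 512        # analysis frame length in samples
    THRESHOLD = 0.3    # normalized RMS energy threshold for an impulse

    def detect_impulse(signal, frame=FRAME, threshold=THRESHOLD):
        """Return the start index of the first frame whose RMS energy
        exceeds the threshold, or None if no impulse is found."""
        for start in range(0, len(signal) - frame, frame):
            window = signal[start:start + frame]
            if np.sqrt(np.mean(window ** 2)) > threshold:
                return start
        return None

    # Load the microphone recording and the ~3 s sculpture kernel
    # (both assumed mono and at the same sample rate for this sketch).
    rate, mic = wavfile.read("mic_input.wav")         # hypothetical file
    _, kernel = wavfile.read("sculpture_kernel.wav")  # hypothetical file
    mic = mic.astype(np.float64) / np.abs(mic).max()
    kernel = kernel.astype(np.float64) / np.abs(kernel).max()

    onset = detect_impulse(mic)
    if onset is not None:
        # Excerpt roughly 100 ms around the clap/snap/click and convolve
        # it with the kernel to render the sculpture's response.
        excerpt = mic[onset:onset + rate // 10]
        response = fftconvolve(excerpt, kernel)
        response /= np.abs(response).max()            # prevent clipping
        wavfile.write("response.wav", rate,
                      (response * 32767).astype(np.int16))

The perspective- and distance-dependent behaviour of the data-sonogram would enter this picture by selecting or interpolating among several kernels, or by filtering and attenuating the response as a function of the listener's position; that stage is omitted here to keep the sketch short.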

Our methodology for selecting the sound synthesis and mapping strategy followed an ethnographic and participatory design approach. To this end, we developed a prototype that allows an individual to listen to the sonic responses to short impulse sounds. Our visually impaired team member, Mike Ciarciello, helped us design and evaluate this sonic experience: he explored the sculpture by touch, and together we investigated what helps to create an audible mental representation of it. As a next step, we ran a small pilot study to evaluate the sound synthesis approach with more participants, following an approach similar to that of Kim and Zatorre [3]. Their protocol involved training individuals to recognize tactile spatial information using sounds mapped from abstract shapes. After training, participants were presented with novel shapes and reported their capacity to deduce a mental model from the audible representation.

We hope to evolve the research described above into an art installation. This project was funded by a Strategic Innovation Fund award from the Centre for Interdisciplinary Research in Music Media and Technology (CIRMMT), and was developed at CIRMMT and the Shared Reality Lab.

[1] Meijer, P. B. L. "An experimental system for auditory image representations," IEEE Transactions on Biomedical Engineering, vol. 39, no. 2, pp. 112-121, 1992.

[2] Yeo, W. S. and Berger, J. "Raster Scanning: A New Approach to Image Sonification, Sound Visualization, Sound Analysis and Synthesis," Proceedings of the International Computer Music Conference, New Orleans, LA, USA, 2006.

[3] Kim, J.-K. and Zatorre, R. J. "Can you hear shapes you touch?," Experimental Brain Research, pp. 747-754, 2010.

[4] McAdams, S., Chaigne, A., and Roussarie, V. "The psychomechanics of simulated sound sources: material properties of impacted bars," Journal of the Acoustical Society of America, vol. 115, no. 3, pp. 1306-1320, 2004.

[5] Hermann, T. and Ritter, H. "Listen to your data: Model-based sonification for data analysis," in Advances in Intelligent Computing and Multimedia Systems (G. E. Lasker, ed.), Baden-Baden, Germany, pp. 189-194, Int. Inst. for Advanced Studies in Systems Research and Cybernetics, 1999.

A project by:

  • Cooperstock, J. R. – Main advisor
  • Grond, F. – Sonic interaction design and project concept
  • Olmos, A. – Design research and project concept
  • Piché, J.; Scavone, G.; Settel, Z. – Advisors
  • Winters, M. – Support for signal processing