Imagine manipulating a three-dimensional animation of a dataset: stretching it with a gesture, tilting and twisting it, seeing it from all sides, erasing unwanted information, and foregrounding particular bits of data. Imagine, further, that you can simultaneously hear the data transformation in three-dimensional audio. In this imagined experience, data is immersive, offering information concurrently to all of your senses: visual, aural, and haptic. Such an experience of data, drawing on interactive multidimensional graphics and sound, offers the possibility of spontaneous leaps in cognition, emphasizing new associations and understandings and drawing attention to previously invisible relationships. This is the ultimate goal of the Data Sensorium.
The Data Sensorium project is a collaboration between Stony Brook University and Brookhaven National Laboratory (BNL), focused on the development, evaluation, and implementation of multimodal visual and auditory interfaces for the analysis of large-scale data sets. Researchers at both institutions are already engaged in both the visualization and the sonification of data. But this work is inherently multifaceted: it requires research in perception and cognition as well as the development of complex tools for delivering sonification and visualization through multimodal display environments. Making data sensorially legible requires integrating concepts from engineering, computer science, psychology, neurobiology, acoustics, music, design, and the arts. This collaboration and conceptual integration is at the heart of the Data Sensorium.
Visualization has been extraordinarily helpful in allowing scientists to comprehend large-scale data, but combining it with sonic and haptic/tactile interfaces promises to be even more effective for large-scale data analysis. Anecdotal evidence suggests that additional sonic (as well as visual and haptic) input increases an individual's ability to process multiple streams of information at once. A growing body of research supports the conclusion that sonification of data increases the communication bandwidth at the human-computer interface, allowing individuals to process the ever-larger data sets that research currently generates without entering into information overload.
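To make the idea of data sonification concrete, the sketch below shows one of its simplest forms, parameter-mapping sonification, in which each data value is mapped to the pitch of a short tone so that a trend in the numbers becomes an audible contour. This is only an illustrative sketch, not part of the Data Sensorium's actual tooling; the function names, frequency range, and the choice of sine tones are all assumptions made for the example.

```python
import math
import struct
import wave

def sonify(values, sample_rate=44100, note_dur=0.2,
           f_min=220.0, f_max=880.0):
    """Parameter-mapping sonification: each value becomes a sine tone
    whose pitch is linearly interpolated between f_min and f_max."""
    lo, hi = min(values), max(values)
    span = (hi - lo) or 1.0  # avoid division by zero for constant data
    samples = []
    for v in values:
        # Linear map: data value -> tone frequency in Hz.
        freq = f_min + (v - lo) / span * (f_max - f_min)
        n = int(sample_rate * note_dur)
        for i in range(n):
            samples.append(math.sin(2 * math.pi * freq * i / sample_rate))
    return samples

def write_wav(path, samples, sample_rate=44100):
    """Write mono 16-bit PCM audio using only the standard library."""
    with wave.open(path, "wb") as w:
        w.setnchannels(1)
        w.setsampwidth(2)
        w.setframerate(sample_rate)
        w.writeframes(b"".join(
            struct.pack("<h", int(s * 32767)) for s in samples))

# Example: an upward data trend is heard as a rising pitch contour.
write_wav("trend.wav", sonify([1, 2, 3, 5, 8, 13]))
```

Richer mappings (loudness, timbre, spatial position) extend the same principle, which is how sonification can add channels of information alongside a visual display rather than competing with it.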
We have brought together people who work with large-scale data and people who are experts in visualization, sound, and movement in the arts. This combined expertise leverages different viewpoints to parse patterns more clearly, drawing on a wider range of sensory capability.