Concept
-------

EllipSoundVis 1 and 2 are two sound visualisation systems built with Processing. They are based on different combinations of ellipses that change according to the music being played. Both share a similar visual language: a black background with shapes drawn as thin lines. The visuals can be manipulated and customised by the user with sliders.

Interface and interaction modality
----------------------------------

Both sound visualisations generate a visual and auditory experience that can be enjoyed on any device capable of running a Java-based program, provided it has a screen and audio output. They can either perform autonomously or be controlled by the user, who can modify the visual output through a number of sliders. These allow the user to influence, in real time, parameters such as the colour, transparency and thickness of the lines, as well as several parameters specific to each visualisation.
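As a rough sketch of how such sliders could be wired up with the controlP5 library in Processing 1.5 (the variable names here are illustrative, not the actual ones used in EllipSoundVis; controlP5 automatically binds each slider to the sketch field with the matching name):

```processing
import controlP5.*;

ControlP5 cp5;
// controlP5 binds each slider to the field of the same name
float lineAlpha  = 180;  // transparency of the lines
float lineWeight = 1;    // thickness of the lines
float lineGrey   = 255;  // greyscale colour of the lines

void setup() {
  size(800, 600);
  cp5 = new ControlP5(this);
  // addSlider(name, min, max, default, x, y, width, height)
  cp5.addSlider("lineAlpha",  0,   255, 180, 20, 20, 100, 10);
  cp5.addSlider("lineWeight", 0.5, 5,   1,   20, 40, 100, 10);
  cp5.addSlider("lineGrey",   0,   255, 255, 20, 60, 100, 10);
}

void draw() {
  background(0);                  // black background
  stroke(lineGrey, lineAlpha);    // slider values applied each frame
  strokeWeight(lineWeight);
  noFill();
  ellipse(width/2, height/2, 300, 300);
}
```

Because the slider values are read again on every call to draw(), moving a slider changes the output immediately, which is what gives the interaction its direct, real-time feel.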

Technology
----------

The sound visualisations are built in Processing 1.5 and use the OpenGL library to enhance the graphics, the Minim library for the sound and the contributed controlP5 library by Andreas Schlegel for the sliders. Both visualisations are based on ellipses that change in response to a loaded song in MP3 format. The background is black in order to provide the highest contrast and to suit dark environments. In EllipSoundVis 1 the ellipses are superposed and follow the vibration of the sound, evoking the strings of a musical instrument being played. In EllipSoundVis 2 the ellipses are nested inside one another, reminiscent of a spinning vinyl record, and can also be reduced to vertical or horizontal lines that react to the sound.
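A minimal sketch of the general approach, assuming Minim's AudioPlayer and its mixed waveform buffer drive the ellipse sizes (the file name and the mapping from samples to radii are illustrative, not the actual EllipSoundVis code):

```processing
import processing.opengl.*;
import ddf.minim.*;

Minim minim;
AudioPlayer player;

void setup() {
  size(800, 600, OPENGL);
  minim = new Minim(this);
  player = minim.loadFile("song.mp3", 1024);  // MP3 from the data folder
  player.play();
  noFill();
}

void draw() {
  background(0);  // black background for maximum contrast
  stroke(255, 180);
  strokeWeight(1);
  // nested ellipses whose diameters are perturbed by the waveform,
  // roughly in the spirit of EllipSoundVis 2
  for (int i = 0; i < player.bufferSize(); i += 64) {
    float level = player.mix.get(i);        // sample in [-1, 1]
    float d = 100 + i * 0.5 + level * 80;   // base size plus audio offset
    ellipse(width/2, height/2, d, d);
  }
}

void stop() {
  player.close();
  minim.stop();
  super.stop();
}
```

Reading the waveform buffer once per frame keeps the ellipses in visible motion with the music while the drawing itself stays cheap enough for real-time rendering.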

User experience
---------------

Both sound visualisations create an immersive yet subtle visual and auditory experience that can be enjoyed by a broad audience. They can be experienced alone or within a large group of people, depending on the size of the screen and on how the sound is diffused.

The user can choose to play the role of viewer and listener, or to customise the experience by manipulating the graphic output in real time. The sliders are simple and intuitive to use and enable an immediate, direct interaction with the visuals.

Research and development contexts
---------------------------------

The context of use is very broad and not limited to a specific situation, although a dark environment with good acoustics would enhance the experience. The sound visualisations could be developed as applications for desktop or mobile devices, although the best experience would be provided by a large projection and a powerful sound system in a large, dark room. Further developments should let the user easily load a sound of their choice.
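One possible way to add user-selected sound, assuming a Minim-based sketch with a `player` field as above (the key binding and field names are illustrative): Processing 1.5 provides selectInput(), which opens a native file dialog and returns the chosen path, or null if the user cancels.

```processing
// Hypothetical extension: press 'l' to load a different MP3 at runtime.
void keyPressed() {
  if (key == 'l') {
    String path = selectInput("Choose an MP3 file");
    if (path != null) {
      player.close();                       // stop the current song
      player = minim.loadFile(path, 1024);  // reuse the existing Minim instance
      player.play();
    }
  }
}
```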
