Transforming light intensity into sounds – light.void~ in hypothetical particles - Anne Liao Zouning

What if musical dimensions could be controlled by changes in light intensity? What if flashlights could transform the natural soundscape of a thunderclap into a chaotic synthesizer, and turn pointillistic raindrops into harmonious chords? This presentation will encompass a six-minute performance of the piece hypothetical particles and a brief discussion of the light-dependent digital musical instrument light.void~, action-sound reactivity, and data-to-sound mapping in my work.

Presented by: Anne Liao Zouning

Hypothetical particles in physics refer to particles that have not yet been observed and proven to exist. However, these particles are necessary for maintaining consistency within a given physical theory. In this composition, I explore this phenomenon by examining the interaction between particles of light and sound. The amplitudes of the lights trigger changes in the music, revealing connections between the natural and synthetic realms of sound.


To facilitate this exploration, I created a digital photo-controller modeled after the light.void~ designed by Felipe Tovar-Henao, currently a Postdoctoral Fellow in Music Composition at the University of Cincinnati's College-Conservatory of Music in the United States. His iteration of light.void~ was acknowledged as an 'inferred replica' of Leafcutter John's light thing.


Light.void~ is a custom-made digital photo-controller that uses 16 light-dependent resistors and operates through an Arduino MEGA 2560 microcontroller board. Each sensor relays 10-bit values to Max/MSP, where they are converted into floating-point numbers and scaled to the range 0 to 1. The transmitted data reflects the intensity of light detected by the sensors, with higher values corresponding to greater light intensity. The device thus serves as both a receiver and a transmitter of data tied to the observed light intensity.
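The per-sensor conversion described above (10-bit Arduino readings scaled to floats between 0 and 1) can be sketched roughly as follows. This is an illustrative Python sketch, not the actual Max/MSP patch, and the function name is hypothetical:

```python
def scale_reading(raw: int) -> float:
    """Map a 10-bit ADC reading (0-1023) to a float in [0.0, 1.0].

    Mirrors the conversion the text describes happening in Max/MSP:
    brighter light on a light-dependent resistor yields a higher value.
    """
    return raw / 1023.0

# One frame of data from all 16 light-dependent resistors
# (example raw values; a real frame comes over serial from the Arduino):
frame = [scale_reading(r) for r in (0, 512, 1023, 300, *([0] * 12))]
```

In the actual instrument this scaling happens inside Max/MSP after the Arduino transmits the raw sensor values over serial.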


I mapped the 16 individual data streams to various sound-rendering techniques, including granulation, comb filtering, and distortion. One light-dependent resistor is assigned to advance the music into the next section, similar to progressing through cues with a MIDI pedal. Different gestures produce changes in light intensity, which correspond to shifts in sound: for example, bringing the flashlight closer to the light.void~ results in a higher density of granulation.
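The two mappings described above can be sketched in miniature: a continuous mapping from light intensity to granulation density, and a threshold on one reserved sensor that advances the piece to its next section. All names, the sensor index, the threshold, and the density ceiling here are hypothetical illustrations, not values from the actual patch:

```python
CUE_SENSOR = 15      # hypothetical index of the LDR reserved for section changes
CUE_THRESHOLD = 0.8  # hypothetical trigger level for advancing a cue

def grain_density(intensity: float, max_grains_per_sec: float = 100.0) -> float:
    """Map a scaled light intensity in [0, 1] to a granulation density.

    Illustrates the gesture described in the text: a flashlight moved
    closer to the sensor (higher intensity) yields denser granulation.
    """
    return intensity * max_grains_per_sec

def advance_cue(frame: list[float], current_section: int) -> int:
    """Step to the next section when the cue sensor crosses its threshold,
    analogous to progressing through cues with a MIDI pedal."""
    if frame[CUE_SENSOR] >= CUE_THRESHOLD:
        return current_section + 1
    return current_section
```

In performance these mappings run continuously on the incoming sensor frames, so sustained gestures with the flashlight translate directly into evolving sound.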
