‘Living Instruments’ was produced jointly by We Spoke New Music ensemble and three members of the Hackuarium community.
With ‘Living Instruments’, performers stimulate pieces of nature, such as microorganisms, to generate sounds: applying heat and movement, watching paramecia through a microscope, touching moss or playing with the proximity of slightly radioactive everyday objects. Sensors placed at several locations measure the activity of the organisms and objects, and the recorded data is then turned into sound through synthesizers. The nature of these instruments allows the performers to engage with them very interactively, and the cyclic behaviour of the living pieces is reflected musically in rich grooves and rhythmic patterns.
Enlarged video footage of the movement of fermentation bubbles, paramecia and moss, visualised traces of radioactivity and other curiosities of nature is displayed on a large screen behind the performers, highlighting the common dynamics between the scientific process and the musical outcome.
‘Living Instruments’ reveals a hidden world, encourages interdisciplinary thinking and presents biology, electronics and their relation to sounds as an ethereal and fun experience. The performers slip into the roles of experimental scientists forming a hybrid setup where the interaction between humans and other living matter generates music.
Bubble Organ
The Bubble Organ is the result of the first ideas which started the whole collaboration and is the main instrument, powered by fermenting yeast cultures. As liquid yeast solutions are fed with sugar and agitated by stirrers, carbon dioxide is produced in several flasks. The gas flows through transparent tubes and forms bubbles while passing through segments filled with blue-coloured water. Ten photosensors measure the bubbles and two pressure sensors measure the accumulated gas pressure in separate, long vertical tubes. All sensor signals are collected by two Arduino boards and sent to a computer via serial connections. A MaxMSP patch analyses the data and maps it to musical notes which trigger an analog synthesizer; it also adds sound effects to the overall output and to additional microphone inputs recording the bubbling action of some of the tubes. Performers play the instrument by gating individual sensor signals with a MIDI keyboard, feeding the yeast with sugar and adjusting the speed of each stirrer. The development of this and the other instruments is documented online and the Arduino code is released as open source. For more information, please read the Wikipedia page or check out the GitHub page.
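The gating and note mapping described above could be sketched roughly as follows. This is an illustrative Python sketch, not the actual MaxMSP patch: the threshold, the pentatonic scale and the sensor-to-note assignment are all assumptions.

```python
# Illustrative sketch (not the actual MaxMSP patch): map the ten
# photosensor readings of the Bubble Organ to MIDI notes, honouring
# the per-sensor gates set by the performer's MIDI keyboard.
# Threshold and scale are assumed values for demonstration.

PENTATONIC = [60, 62, 64, 67, 69]  # assumed C major pentatonic scale
THRESHOLD = 512                    # a passing bubble darkens the photosensor

def bubbles_to_notes(readings, gates):
    """Return MIDI notes for sensors that detect a bubble and are gated on.

    readings: 10-bit analog values from the ten photosensors (0-1023)
    gates: booleans toggled by the performer's MIDI keyboard
    """
    notes = []
    for i, (value, gate) in enumerate(zip(readings, gates)):
        if gate and value < THRESHOLD:  # bubble blocks the light
            notes.append(PENTATONIC[i % len(PENTATONIC)])
    return notes

# Example: sensors 0 and 3 see bubbles, but only sensor 0 is gated on
readings = [100, 900, 850, 120, 800, 900, 870, 860, 900, 880]
gates = [True, True, True, False, True, False, True, True, False, True]
print(bubbles_to_notes(readings, gates))  # → [60]
```

The gate check is what lets the performer "play" a process they do not fully control: fermentation decides when a note is offered, the keyboard decides whether it sounds.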
iPadPix
Inspired by cloud chambers, iPadPix allows intuitive exploration of natural and other sources of low radioactivity by means of augmented reality. Different particle types are distinguished by evaluating their interaction with a pixel detector. Recorded traces of radiation are displayed on top of the live video feed from a tablet’s camera and trigger percussion sounds of an analog synthesizer. The mobility of iPadPix enables artistic as well as educational activities to observe radioactivity from everyday objects and the environment over time and space. For more information, see the article by O. Keller et al., “iPadPix – A novel educational tool to visualise radioactivity measured by a hybrid pixel detector” or this GitHub page.
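Distinguishing particle types from their pixel traces could look roughly like this. The heuristics below (pixel counts, a density cut, the sound assignments) are simplified assumptions for illustration, not iPadPix’s actual classifier; they only echo the general idea that alphas leave compact blobs, electrons leave sparse curly tracks, and photons leave small dots.

```python
# Simplified sketch of cluster-shape classification on a pixel detector.
# Thresholds and the percussion mapping are illustrative assumptions,
# not the actual iPadPix algorithm.

def classify_cluster(pixels):
    """Guess a particle type from the set of hit pixels (x, y) of one trace."""
    n = len(pixels)
    xs = [x for x, _ in pixels]
    ys = [y for _, y in pixels]
    width = max(xs) - min(xs) + 1
    height = max(ys) - min(ys) + 1
    if n <= 4:
        return "photon"        # small dot: X-ray or gamma
    density = n / (width * height)
    if density > 0.6:
        return "alpha"         # compact, round blob
    return "electron"          # sparse, curly track

SOUND = {"photon": "hi-hat", "alpha": "kick", "electron": "snare"}

blob = [(x, y) for x in range(5) for y in range(5)]  # 25 dense pixels
print(SOUND[classify_cluster(blob)])                 # → kick
```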
Mossphone
A moss–human interface conceived back in 2013 and developed further as an interdisciplinary research project on natural bioreporters: indicator species, such as moss and lichens, through which we can sense the environment and climate change. Mosses are tiny organisms, the first plants to emerge from the ocean and conquer the land; unique and delicate native species with slow temporalities of growth that can decipher the secrets of life on Earth. They are pioneer species that can live in very harsh conditions and also provide important microhabitats for an extraordinary variety of organisms and plants. Interacting with the Mossphone produces a ludic sound feedback that changes depending on the human in contact with it. When the moss is touched, the human becomes, in a way, part of the same element, changing its resistance (capacitive sensor); these changes are captured as data and converted into sound. The result is the illusion that the moss is alive and reacts to the way we touch it, as if it were singing, snarling, murmuring or growling. Technical information: an electronic circuit with an active capacitive sensor, or antenna; the organic interactive object invites touch and reacts to the arousal of physical contact by emitting real-time sinusoidal sound feedback through a serial connection with Pure Data or MaxMSP.
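The capacitance-to-tone mapping might be sketched as follows. This is an assumed illustration of what a Pure Data or MaxMSP patch could do with the incoming serial data; the frequency range and sensor range are invented for the example.

```python
# Illustrative sketch (assumed parameters): map the Mossphone's raw
# capacitive sensor reading to the frequency of a sinusoidal feedback
# tone, roughly what the Pure Data / MaxMSP patch does with serial data.
import math

F_MIN, F_MAX = 110.0, 880.0  # assumed tone range in Hz (two octaves)
C_MIN, C_MAX = 0, 1023       # assumed raw sensor range from the serial port

def touch_to_frequency(raw):
    """Map a raw capacitance reading onto an exponential pitch curve."""
    clamped = min(max(raw, C_MIN), C_MAX)
    t = (clamped - C_MIN) / (C_MAX - C_MIN)
    return F_MIN * (F_MAX / F_MIN) ** t  # exponential, so pitch feels even

def sine_sample(raw, time_s):
    """One sample of the sinusoidal feedback tone at a given time."""
    return math.sin(2 * math.pi * touch_to_frequency(raw) * time_s)

print(round(touch_to_frequency(0)))     # no touch  → 110
print(round(touch_to_frequency(1023)))  # firm touch → 880
```

An exponential curve is used rather than a linear one because pitch perception is logarithmic: equal changes in touch then feel like equal musical intervals.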
Virtual Soprano
A facial detection program (of the kind often used for surveillance) detects the human performer’s facial expression to mimic a soprano singer. The algorithm turns specific spots of the performer’s face into vocal sounds using a modified version of FaceOSC by Kyle McDonald, running in openFrameworks. Like a virtual soprano, a framed version of the user’s face is screened in front of the public, enabling the virtual soprano to give her performance.
Paramecia Controller
The paramecia controller applies a 5-volt electrical tension across a mini swimming pool (1 cm square) for unicells (around 0.1 mm in size), which makes them move from one side to the other on command, using a joystick. Their activity is magnified on the screen and the sound design follows their movements. All unicells are returned to a nearby canal after the performance, unharmed.
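The joystick-to-electrode mapping could be sketched like this. The sketch assumes the standard galvanotaxis effect, in which paramecia swim toward the cathode; the function names and the centred-stick behaviour are illustrative assumptions about the implementation.

```python
# Sketch (an assumption about the implementation): steering the joystick
# chooses which electrode becomes the cathode, since paramecia swim
# toward the cathode (galvanotaxis).
V = 5.0  # volts across the pool, as in the performance setup

def joystick_to_voltages(x):
    """Map joystick deflection x in [-1, 1] to (left, right) electrode volts."""
    if x > 0:             # steer right: right electrode becomes cathode (0 V)
        return (V, 0.0)
    if x < 0:             # steer left: left electrode becomes cathode
        return (0.0, V)
    return (0.0, 0.0)     # centred: no field, cells swim freely

print(joystick_to_voltages(0.8))  # → (5.0, 0.0)
```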
Paramecia movement tracker
A video recognition program tracks the free movements of paramecia under a microscope and triggers samples each time they enter a specific zone, creating a fugue of samples. When magnified 100 times, the paramecia seem to be dancing on the screen, providing beautiful visuals to complement the music.
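The zone-triggering logic can be sketched as follows. The zone layout and sample names are invented for the example; the key point, which the sketch illustrates, is that a sample fires only when a tracked cell *enters* a zone, not continuously while it stays there.

```python
# Illustrative sketch (zone layout and sample names are assumptions):
# trigger a sample only on the frame where a tracked paramecium enters
# a zone, so that free-swimming cells produce discrete musical events.

ZONES = {"bass": (0, 0, 50, 50), "flute": (50, 0, 100, 50)}  # x0, y0, x1, y1

def in_zone(pos, rect):
    x, y = pos
    x0, y0, x1, y1 = rect
    return x0 <= x < x1 and y0 <= y < y1

def triggers(prev_pos, curr_pos):
    """Samples to play for one cell moving from prev_pos to curr_pos."""
    return [name for name, rect in ZONES.items()
            if in_zone(curr_pos, rect) and not in_zone(prev_pos, rect)]

print(triggers((60, 20), (30, 20)))  # cell crossed into the "bass" zone
```

With several cells tracked at once, each entry event layers another sample on top of the last, which is what produces the fugue-like texture described above.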