LiveStream.220306

We are now starting a series of live streams on twitch.tv for our project 'Across the Valley – Über das Tal', together with Marc Sloan (Easton, Pennsylvania).
The streams will have different focuses and different durations: some may be only 15 minutes, others several hours.

We begin on Sunday, March 6 on the topic:
Orienting ourselves together in space while we are distant from each other.

This stream will be relatively short, so please come on time. Or, if you can’t make it live, you can find a recording of it afterwards on our website.

March 6, 2022 12:30 EST – 17:30 UTC – 18:30 CET
https://twitch.tv/blackholefactory

Sound is very important in this project, so please use good headphones or a sound system!

Recording of the LiveStream @twitch.tv:

Across the Valley – tree to cymbal

A test to transform the movements of a branch of a distant tree into sounds on a cymbal. Vibration sensors are attached to the tree and connected to a solar-powered Raspberry Pi. The data is sent over mobile internet to the studio, where a Pd patch analyses the data stream and sends it to an Arduino which controls a servo motor.
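To illustrate the sender side of this chain, a minimal sketch of what the Raspberry Pi could do might look like the following. The address, message format and the sensor-reading function are assumptions for illustration, not the actual code used in the project.

import json
import socket
import time

STUDIO = ("192.0.2.10", 9000)   # placeholder IP/port of the studio machine
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

def read_piezo():
    # placeholder for reading the vibration sensor through an ADC on the Pi
    return 0.0

while True:
    # one small, timestamped datagram per reading over the mobile connection
    packet = json.dumps({"t": time.time(), "piezo": read_piezo()}).encode()
    sock.sendto(packet, STUDIO)
    time.sleep(0.05)            # roughly 20 readings per second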

 

The tree:

bird_12-9-7b - blackhole-factory

Recorded in the installation at Haus der Braunschweigischen Stiftungen – in the garden, within range of sensor units 12, 9 and 7.
Each of these three sensor units is equipped with two piezo sensors that detect the movements of the branches of the plants. Unit #7 is in a tree, unit #9 on a bush and unit #12 on a hedge. The sensor data is sent via a radio network to the central computer, where it is converted into control data for the sound synthesis of these three locations. Via Wi-Fi, the data is sent to the smartphone of a user at this location and, together with the smartphone's motion sensors, controls the sound generated in the app.
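As a rough illustration of that conversion step, the two piezo readings of one sensor unit could be turned into control data roughly like this. The parameter names, value ranges and scaling are made up for the example, not the actual mapping of the installation.

def to_control_data(piezo_a, piezo_b, lo=0, hi=1023):
    # normalise both piezo readings of one unit to the range 0..1
    a = min(max((piezo_a - lo) / (hi - lo), 0.0), 1.0)
    b = min(max((piezo_b - lo) / (hi - lo), 0.0), 1.0)
    return {
        "intensity": max(a, b),   # overall movement of the branch
        "balance": b - a,         # which of the two sensors moves more
    }

# one control message per unit (7 = tree, 9 = bush, 12 = hedge),
# which could then be forwarded over Wi-Fi to the smartphone app
controls = {unit: to_control_data(512, 600) for unit in (7, 9, 12)}
print(controls)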

The Agents Play Their Music

Bringing the A.O.S.C. project to the next level – the agents play their music!

In the previous setup the agents functioned only as the ears of the system, listening to the acoustic environment and sending the data of the audio spectrum to the main server. Everything else happened on the server side: the analysis of the data streams, the calculation of the parameters for the sound synthesis, and the playback of the music.

Now we bring the music back to the agents!

The server now only does the calculations and sends the stream of parameters back to the agents, where the music is generated by a Pd patch.
That means you can sit beside an agent, listen to your actual acoustic environment and, through headphones, to the collective electronic soundscape composed by all the networked agents.
The system now makes it possible to directly observe the impact of acoustic events at the agent's location, and to interact acoustically with the system.
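A very small sketch of the agent side of this loop could look like the following: receive the parameter stream from the server and forward it to the local Pd patch that plays the music. The server address, the subscription handshake, the parameter names and the Pd port are assumptions for illustration, not the actual protocol of the project.

import socket

SERVER = ("192.0.2.20", 9001)   # placeholder IP/port of the central server
PD_PATCH = ("127.0.0.1", 3000)  # local Pd patch, e.g. listening with a [netreceive] object on UDP

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
pd = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

# announce this agent to the server (the handshake is an assumption)
sock.sendto(b"subscribe toronto", SERVER)

while True:
    data, _ = sock.recvfrom(1024)
    # the server is assumed to send name/value pairs like "density 0.4 pitch 62"
    fields = data.decode().split()
    for name, value in zip(fields[::2], fields[1::2]):
        # forward each parameter as a FUDI-style message ("name value;") to the Pd patch
        pd.sendto(f"{name} {value};\n".encode(), PD_PATCH)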

At the moment we have four agents running: Toronto, Lviv, Sydney and Bonaforth.

Currently we are working on the algorithm which receives the data stream and calculates the control data for the music.

Here are some snippets.

aosc170731 is a YouTube video with synchronized visualisation of the data streams.

Dance_Code

 

Dance_Code
Photo: blackhole-factory

Dance_Code is a project by the dancer Agnetha Jaunich (Kassel) in collaboration with blackhole-factory, exploring the possibilities of transforming movement and sound into 3D graphics in an improvisation.

The movements of the dancer are tracked by a Kinect sensor and mapped to the position and shape of the 3D object. The frequency spectrum and amplitude of the sound change the texture and distortion of the object.
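A minimal sketch of such a mapping could look like the following; the parameter names, joint choice and scaling are made up for the example, not the values used in Dance_Code.

def map_dancer_to_object(hand, amplitude, centroid_hz):
    # hand: tracked Kinect joint position (x, y, z) in metres
    # amplitude: current level of the sound, 0..1
    # centroid_hz: spectral centroid as a rough brightness measure
    x, y, z = hand
    return {
        "position": (x, y, z),                            # dancer's hand drives the object position
        "scale": 0.5 + max(y, 0.0),                       # raising the hand enlarges the object
        "distortion": min(amplitude * 4.0, 1.0),          # louder sound distorts the mesh more
        "texture_shift": min(centroid_hz / 8000.0, 1.0),  # brighter spectrum shifts the texture
    }

# example frame: hand at (0.2, 1.1, 2.4) m, moderate level, fairly bright sound
print(map_dancer_to_object((0.2, 1.1, 2.4), amplitude=0.2, centroid_hz=3200.0))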

Agnetha Jaunich – dance
Elke Utermöhlen – voice
Martin Slawig – percussion + live processing, programming

The project is supported by Kulturinstitut der Stadt Braunschweig

Watch an excerpt on Vimeo: