︎



AudioVisual Experiments: One. 


The first in a series of audiovisual explorations. Both audio and video are original works created in unison to explore algorithmic modes of making.


Both visuals and audio are automated, at times randomly, through a specified set of numeric values. The result is an interaction in which graphic vector images are triggered and transformed by the audio input.
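The idea of audio driving vector shapes can be sketched in a few lines of Python. This is a generic illustration, not the project's actual code: it uses a synthetic tone as a stand-in for live audio, and the frame size, mapping range, and polygon shape are all illustrative choices.

```python
import math

SAMPLE_RATE = 44100
FRAME = 1024  # audio samples per animation frame (illustrative)

def rms(samples):
    """Root-mean-square amplitude of one frame of audio."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def polygon(radius, sides=6):
    """Vertices of a regular polygon whose size is driven by audio level."""
    return [(radius * math.cos(2 * math.pi * i / sides),
             radius * math.sin(2 * math.pi * i / sides))
            for i in range(sides)]

# Synthetic stand-in for a live input: a 220 Hz tone with a one-second fade-in.
audio = [min(1.0, n / SAMPLE_RATE) * math.sin(2 * math.pi * 220 * n / SAMPLE_RATE)
         for n in range(SAMPLE_RATE)]

# Each frame of audio sets the scale of the vector shape for that video frame:
# louder audio, larger shape.
frames = [polygon(radius=10 + 90 * rms(audio[i:i + FRAME]))
          for i in range(0, len(audio) - FRAME, FRAME)]
```

As the fade-in raises the audio level, the polygon grows from frame to frame; in practice the same mapping can drive rotation, color, or any other numeric parameter of the graphics.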

This project was initiated as part of Sonic Jeel as a way to test new methods of audio production using hardware and software. As part of my design practice, I am currently exploring sound as a catalyst for new work while applying my interests in automated digital processes and generative shape-making.
Some of the audio was produced using the OP-Z by Teenage Engineering and Ableton Live. The animations were programmed using VideoLab in Unity. VideoLab is a patch-based interface that enables node-based coding with prebuilt scripting modules; it lets users create basic graphic images and animate those images by linking them to specific audio parameters via MIDI input. My code was based on work by Keijiro Takahashi, who has published developer projects for Unity and Teenage Engineering.
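The MIDI-to-parameter linking described above can be modeled in miniature. This Python sketch is a toy analogue, not VideoLab's API: the event list stands in for live messages from a synth, and the controller numbers and parameter names are hypothetical.

```python
# Hypothetical MIDI control-change events as (controller, value) pairs;
# in the actual setup these would arrive live from the OP-Z.
events = [(1, 0), (1, 64), (2, 127), (1, 127)]

# Map MIDI controller numbers to animation parameters (names are illustrative).
bindings = {1: "shape_scale", 2: "rotation_speed"}

params = {"shape_scale": 0.0, "rotation_speed": 0.0}

for cc, value in events:
    if cc in bindings:
        # Normalize the 7-bit MIDI value (0-127) to the 0.0-1.0 range
        # that animation parameters typically expect.
        params[bindings[cc]] = value / 127.0
```

Each bound parameter simply tracks the most recent value received on its controller, which is the essence of patching a knob or key to a visual property.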

Other audio was produced using the softPop, an organic analog mini-synth by Bastl Instruments. The visuals were generated using Quartz Composer, from Apple's Xcode developer tools. Quartz Composer is a node-based programming environment, similar to Max/Jitter, that allows basic generative effects to be triggered by audio inputs. Audio is relayed internally with Loopback software and a Focusrite USB hardware interface.



Experiment using the VideoLab Klak patch in Unity with an OP-Z mini-synth

Experiment using the VideoLab Klak patch in Unity with an OP-Z mini-synth

Experiment with Quartz Composer using the softPop analog synth

VideoLab interface in Unity