Towards the end of summer 2015 I was asked by a friend, colleague and mentor to contribute to his sound installation, which took place at Café Oto’s Project Space between the 10th and 13th of September 2015.

Some twenty people participated in this joint project, and I was delighted to have been asked and to be able to take part.

My contribution ended up exploring further the area of belief, albeit in a different format than the ones I am accustomed to, namely those originating from live, real-time performance.

For this event I worked without the element of improvisation, other than what is required to operate a single rotary dial. I concentrated instead on the conceptual aspect and synthesized some of my thoughts on gestures and their projection onto meaning/belief.

Tom Mudd’s event was named “Control” and set out to investigate and explore the affordances of interface design in new instruments.

Here is the link to Tom’s website:

Here is the short description of my contribution, as it appeared on the publicity for the event:

‘noControl’ investigates the apparent need of the average user for immediate, conspicuous feedback from a given interface. As patience for a response diminishes, so does the ability to remain open to subtlety and to events that evolve over a wider time window. The illusion of control over the interface often accompanies this restlessness/impatience. Interactions therefore tend to manifest as big, dramatic and short-lived movements/gestures. Under the hood of noControl, several chaotic behaviours shape the sound events so that they morph continuously, and the degree of agency the user is able to exercise is limited and partial. Nevertheless, he/she will make believe, assigning meaningful mappings between his/her actions and the sonic output.

No belief is True. No. Belief. Is. True. (Jed McKenna)

And here is how my contribution sounded. As you will notice, quick gestures do not produce a proportionate output. You really need to let it be to hear what it does…

It is clearly audible that when the user performs abrupt, dramatic changes in the angular velocity of the knob, no sound is output.

In fact, the patch works at its intended capacity when left alone. All sound processes are internally generated by chaotic behaviours, and the user has very limited say in what the result will be. He/she can slightly adjust the angular position, but must then wait to perceive the ever-evolving patterns that are generated.
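The gating of user input by rotation speed can be sketched as follows. This is a minimal, illustrative sketch in Python, not the actual Max/Gen code; the threshold value and function names are hypothetical.

```python
# Sketch of the angular-velocity gate: buffer reloads are permitted
# only while the knob is moving slowly (illustrative threshold).

THRESHOLD = 0.5  # radians per second; hypothetical value


def angular_velocity(prev_angle, angle, dt):
    """Angular velocity of the knob between two successive readings."""
    return (angle - prev_angle) / dt


def reload_allowed(prev_angle, angle, dt, threshold=THRESHOLD):
    """New audio may be loaded into the buffers only below the threshold."""
    return abs(angular_velocity(prev_angle, angle, dt)) < threshold
```

An abrupt twist of the dial thus inhibits new material, while a gentle adjustment lets the internal processes reload and evolve.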

The actual programming of noControl is quite involved. The patch comprises several buffers of audio, over which varying degrees of signal processing are applied and combined in a non-linear, dynamical fashion. This is mostly achieved by coupling oscillators and driving their frequencies via stochastic processes and, partially, via the user’s input. As already mentioned, several constituent modules make up the global behaviour of the patch, and the trigger event that initiates most of its processes is NOT determined by the user, despite what she/he might think.

There are two main buffers, whose audio content is combined and treated under operations such as modulus, multiplication, summation, average and min. These buffers are loaded with new audio when triggered by internal changes, and this event does result from the user’s input: the loading of new audio is inhibited when the angular velocity of the knob’s rotation exceeds a threshold value and, conversely, allowed when it does not. The audio content resulting from the interpolation of the two buffers is stored in yet another buffer, which serves as the wavetable for an oscillator whose frequency and phase interact recursively with the chaotic coupling of two further oscillators.

The actual non-linear behaviour is implemented in Gen, a welcome addition since Max 6 that allows one not only to code directly in a text-based style (codebox) but also, more importantly, to process audio with single-sample accuracy, which was previously impossible.
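The buffer-combination stage described above can be sketched roughly as follows. This is an illustrative Python/NumPy approximation, not the Max/Gen implementation; the mixing weights, normalisation and function names are all hypothetical.

```python
# Illustrative sketch of combining two audio buffers under the operations
# named above (modulus, multiplication, summation, average, min), then
# reading the result back as a wavetable.
import numpy as np


def combine_buffers(a, b):
    """Blend two equal-length buffers under several operations, then
    normalise the mix to the +/-1 range (equal weights are assumed)."""
    ops = [
        np.mod(a, np.abs(b) + 1e-9),   # modulus (guarded against zero divisor)
        a * b,                          # multiplication
        a + b,                          # summation
        (a + b) / 2.0,                  # average
        np.minimum(a, b),               # min
    ]
    mixed = sum(ops) / len(ops)
    peak = np.max(np.abs(mixed))
    return mixed / peak if peak > 0 else mixed


def wavetable_osc(table, freq, sr, n_samples):
    """Read the combined buffer as a wavetable at a given frequency."""
    phase = (np.arange(n_samples) * freq * len(table) / sr) % len(table)
    return table[phase.astype(int)]
```

In the real patch the oscillator’s frequency and phase are not fixed as here, but fed recursively from the chaotic coupling of two other oscillators, which is what keeps the output in continuous flux.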