According to Robert Rowe, interactive music systems can be classified according to:

  • Interpretation: Score-driven vs. performance-driven
  • Response: Transformative, generative, or sequenced
  • Paradigm: Instrument vs. player

‘Instrument paradigm systems are concerned with constructing an extended musical instrument: performance gestures from a human player are analyzed by the computer and guide an elaborated output exceeding normal instrumental response. Imagining such a system being played by a single performer, the musical result would be thought of as a solo.

Systems following a player paradigm try to construct an artificial player, a musical presence with a personality and behavior of its own, though it may vary in the degree to which it follows the lead of a human partner. A player paradigm system played by a single human would produce an output more like a duet.’

Reference:

Rowe, R.: Interactive Music Systems: Machine Listening and Composing. MIT Press, Cambridge, MA, USA, 1992.

DORY

Dory is a player-paradigm interactive music system for freely improvised music. While indebted to research in decentralised, agent-based musical systems and those employing Markov processes, Dory sets itself apart by capitalising on the artefacts introduced by suboptimal machine listening affordances and by speculating on the role of episodic and short-term memory in improvisation, realised as a second-order Markov chain. Dory exhibits both reactive and learning traits, but is unable to commit exclusively to either. By employing a subsumption architecture, unsophisticated machine listening techniques and a probabilistic treatment of musical information, Dory comes across as an engaging musical partner that exhibits seeming novelty and creativity. The video presents the second short improvisation I played with Dory during the Improvisational Creativity Workshop, Monash University, Prato, Italy, on 20 July 2017.
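Dory's internals are not spelled out here, but the second-order Markov idea is easy to illustrate. The sketch below is my own minimal Python illustration, not Dory's code: it counts transitions over pairs of recent pitch events (MIDI note numbers are an assumption made for the example) and samples continuations from those counts.

```python
# Illustrative sketch only: a second-order Markov chain over pitch events.
# The representation (MIDI note numbers) and all names are assumptions for
# this example, not Dory's actual implementation.
import random
from collections import defaultdict

class SecondOrderMarkov:
    def __init__(self):
        # Transition counts keyed by the two most recent events.
        self.counts = defaultdict(lambda: defaultdict(int))

    def learn(self, events):
        # Count how often each event follows each pair of preceding events.
        for a, b, c in zip(events, events[1:], events[2:]):
            self.counts[(a, b)][c] += 1

    def next_event(self, a, b):
        # Sample the next event given the last two observed events.
        options = self.counts.get((a, b))
        if not options:
            # Unseen context: fall back to any event the model has observed.
            seen = [c for ctx in self.counts.values() for c in ctx]
            return random.choice(seen)
        events, weights = zip(*options.items())
        return random.choices(events, weights=weights)[0]

# Learn from a short pitch sequence, then generate a continuation.
model = SecondOrderMarkov()
model.learn([60, 62, 64, 62, 60, 67, 65, 64, 62, 60])
a, b = 60, 62
phrase = [a, b]
for _ in range(8):
    c = model.next_event(a, b)
    phrase.append(c)
    a, b = b, c
print(phrase)
```

Because the context is two events long rather than one, the generated phrase reproduces slightly longer melodic fragments from the learned material while still diverging from it, which is the short-term-memory flavour the chain is meant to suggest.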

CONTROL

Towards the end of summer 2015 I was asked by a friend, colleague and mentor to contribute to his sound installation, which took place at Café Oto's Project Space between 10 and 13 September 2015.

Tom Mudd’s event was named “Control” and set out to investigate and explore the affordances of interface design in new instruments.

CONTROL – AN INTERACTIVE SOUND INSTALLATION: TOM MUDD

10–13 SEPTEMBER 2015, 1–9PM, OTO PROJECT SPACE


Here is the link to Tom’s website:

http://tommudd.co.uk/control/

Here is the short description of my contribution, as it appeared on the publicity for the event:

‘noControl’ investigates the apparent need of the average user to have immediate and big-featured feedback from a given interface. As patience for a response diminishes, so does the ability to remain open to subtlety and to events that evolve over wider time windows. The illusion of control over the interface often accompanies this restlessness/impatience, so interactions tend to manifest as big, dramatic and short-lived movements/gestures. Under the hood of noControl, several chaotic behaviours shape the sound events so that they morph continuously, and the degree of agency that the user is able to exercise is limited and partial. Nevertheless, he/she will make believe and assign meaningful mappings between his/her actions and the sonic output.

No belief is True. No. Belief. Is. True. (Jed McKenna)

And here is how my contribution sounded. As you will notice, quick gestures do not produce a proportionate output. You really need to let it be to hear what it does…

In the recording it is clearly audible that when the user makes abrupt, dramatic changes in the angular velocity of the knob, no sound is output.

In fact, the patch works at its intended capacity when left alone. All sound processes are internally generated by chaotic behaviours, and the user has a very limited say in what the result will be. He/she can slightly adjust the angular position but then has to wait to perceive the ever-evolving patterns that are generated.
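For readers curious how this kind of behaviour can be wired up, here is a minimal Python sketch of the idea rather than the actual installation patch: a chaotic map generates the sound-shaping parameter, the knob position only nudges it slightly, and abrupt angular velocity silences the output. The logistic map, the coupling constant and the velocity threshold are all my own illustrative assumptions.

```python
# Minimal sketch of the noControl idea (illustrative assumptions throughout,
# not the actual patch): chaos drives the sound, the knob barely matters,
# and jerky movements mute the output.

def nocontrol_step(x, knob_angle, prev_angle, dt=0.05,
                   r=3.9, coupling=0.01, velocity_limit=2.0):
    """Advance one control step.

    x          -- current state of the logistic map (0..1)
    knob_angle -- current knob position in radians
    prev_angle -- knob position at the previous step
    Returns (new_x, amplitude); amplitude is 0 when the knob moves too fast.
    """
    # Chaotic core: logistic map with r = 3.9, well inside the chaotic regime.
    x = r * x * (1.0 - x)

    # The knob only perturbs the state by a small coupling factor.
    x = min(max(x + coupling * (knob_angle % 1.0), 0.0), 1.0)

    # Abrupt changes in angular velocity gate the output to silence.
    angular_velocity = abs(knob_angle - prev_angle) / dt
    amplitude = 0.0 if angular_velocity > velocity_limit else x
    return x, amplitude


# Gentle adjustments keep sound flowing; a sudden jerk (step 3) silences it.
x, angle = 0.4, 0.0
for step, new_angle in enumerate([0.02, 0.04, 0.06, 1.5, 1.52]):
    x, amp = nocontrol_step(x, new_angle, angle)
    angle = new_angle
    print(f"step {step}: amplitude {amp:.3f}")
```

Running the example, the slow adjustments leave the amplitude non-zero while the sudden jump in the fourth step mutes it, mirroring the behaviour described above: the user can bias the system slightly, but the evolving patterns come from the chaotic process itself.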

 
