Programming their own software tools allows students to achieve musical goals which are well beyond the reach of conventional off-the-shelf software. Recent projects include algorithmic work, live electronics, interactive systems, spatialisation tools and personalised sound generation and processing environments.
Programming techniques are taught using Max/MSP, a highly versatile graphical programming environment specifically designed for musicians. Much of the interactive installation work presented on this site was programmed using Max/MSP.
JAMES EADE, Birdsong (2015)
Generative data sonification within a Max/MSP coding environment. Inspired by Brian Eno's work on generative process and the Listen to Wikipedia project by Steven LaPorte and Mahmoud Hashemi, Birdsong cycles through Tweets containing a user-specified hashtag. The piece is continuous and open in form, with no specified beginning or end. Intended to represent a worldwide ensemble of voices, Birdsong produces a musical interpretation of Twitter that differs each time it is played.
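The core idea of sonifying a tweet stream can be suggested with a short sketch. This is a hypothetical mapping of my own, not Eade's actual one: each character of a tweet is folded onto a pentatonic scale, so every tweet yields its own short melodic phrase.

```python
# Hypothetical tweet-to-melody mapping (illustrative only, not Birdsong's
# actual algorithm): each alphanumeric character is mapped to a MIDI note
# on a C major pentatonic scale, so each tweet becomes a melodic phrase.

PENTATONIC = [60, 62, 64, 67, 69]  # MIDI note numbers: C D E G A

def tweet_to_notes(text, scale=PENTATONIC):
    """Map each alphanumeric character to a scale degree."""
    return [scale[ord(c) % len(scale)] for c in text.lower() if c.isalnum()]

# Each hashtag match produces a different phrase, so the output changes
# every time the stream is sampled.
for tweet in ["#birdsong at dawn", "morning chorus #birdsong"]:
    print(tweet_to_notes(tweet))
```

Because the phrases depend entirely on the incoming text, playback never repeats exactly, which is the open-form quality the piece is after.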
JOSH TWILLEY, Score (2012)
Software for instantaneous audio manipulation and transformation. Score aims to facilitate and encourage musical exploration by anyone, and can be operated without any prior knowledge of musical terminology, theory or complex digital tools. Its interface is a simple 5 by 5 sliding puzzle: each of the 24 puzzle pieces represents an audio sample, and each of the 25 grid positions is connected with one of four audio manipulation patches.
When a puzzle piece is selected, the manipulation patch belonging to its current grid position is applied to that piece's audio sample, generating highly varied and interesting new sounds.
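The piece/position mapping can be sketched in a few lines. The patch names, sample filenames and round-robin layout below are assumptions of mine for illustration, not details of Twilley's patch:

```python
# Hypothetical sketch of the Score mapping: 24 pieces each hold a sample,
# and each of the 25 grid positions (one left empty) is tied to one of
# four manipulation patches. Selecting a piece applies the patch that
# belongs to the position the piece currently occupies.

PATCHES = ["granulate", "pitch_shift", "ring_mod", "filter_sweep"]  # assumed names

# Position i -> one of the four patches (a simple round-robin layout).
position_patch = {pos: PATCHES[pos % len(PATCHES)] for pos in range(25)}

# Piece n -> its sample file and current grid position (position 24 starts empty).
pieces = {n: {"sample": f"sample_{n:02d}.wav", "pos": n} for n in range(24)}

def select(piece_id):
    """Return which patch is applied to which sample when a piece is chosen."""
    piece = pieces[piece_id]
    return piece["sample"], position_patch[piece["pos"]]

print(select(5))  # -> ('sample_05.wav', 'pitch_shift')
```

Sliding the puzzle changes each piece's `pos`, so the same sample meets a different patch after every move; that is what keeps the results varied.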
JAMES JOSLIN, Coupes la Parole (2012)
Toy piano, music box and live electronics in the Max/MSP coding environment. Coupes la Parole, translated as 'Interruptions', is informed by the idea of the non sequitur and based on mobile-form works such as Pierre Boulez's Third Piano Sonata. The work comprises six movements, each triggered at random by a sound effect unrelated to anything heard before it.
SAM DODSON, Swatch (2009)
A third year Final Project: 'Swatch is a custom software-based instrument - intended for a live performance setting - that enables the user to import, loop, and manipulate pre-recorded sound, consequently outputting material of a glitch aesthetic.' Part of the motivation for this project was to explore the importance of interface design in software-based instruments.
STEVE MORGAN, Project 3 (2009)
A second year programming project: a sound development environment with a video component for live performance.
ROB MALONE, Project 3 (2008)
This patch was designed to enable improvisation and musical 'play' in a live context by combining live input with pre-recorded loops in an easy-to-control interface. The patch permits variable loop lengths (as loops get shorter, they create a granulation effect, as can be heard here) and includes a voice-controlled synth.
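Why a shrinking loop turns into granulation can be shown with a minimal sketch (my own, not taken from Malone's patch): repeating a very short slice of a recording at audio rate stops sounding like a repeat and starts sounding like a pitched, grain-like texture.

```python
# Minimal loop-playback sketch: cycle through the first loop_len samples
# of a recording. At 0.5 s the repeat is audible as a loop; at 10 ms the
# slice repeats ~100 times per second and reads as a granular drone.

import math

SR = 44100  # sample rate in Hz (assumed)

def loop_playback(recording, loop_len, out_len):
    """Cycle through the first loop_len samples until out_len samples exist."""
    return [recording[i % loop_len] for i in range(out_len)]

# One second of a 220 Hz sine stands in for recorded input.
recording = [math.sin(2 * math.pi * 220 * n / SR) for n in range(SR)]

long_loop = loop_playback(recording, SR // 2, SR)    # 0.5 s loop: heard as looping
short_loop = loop_playback(recording, SR // 100, SR) # 10 ms loop: grain-like tone
print(len(long_loop), len(short_loop))
```

In the patch this threshold is crossed continuously as the performer shortens the loop, which is where the granulation effect described above comes from.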
SEBASTIAN MOODY, Project Three (2008)
This patch was designed to be a compositional playback device that would allow certain musical variables to be changed in real time. In other words, the overall musical thread stays the same but certain details may be different from performance to performance. The patch combines samples (voice, long violin notes and the amen drumbreak), synthesis and processing (e.g. pitch shifting, delay and spectral effects) to achieve a rich variety of timbres.
SAM DODSON, Circulator (2008)
This is a sample-based live-performance patch designed to generate ambient soundscapes, getting quick, meaningful results from any soundfile that is loaded into it. It is arranged in 'chunks' of code, offering control over volume, pan, pitch, amplitude modulation, bit/sample resolution, delay and filters, and has a variety of automated controls to ensure that the performer need not attempt to control everything at the same time. The interface for the patch was considered carefully to ensure that it was a welcoming environment and to enable easy navigation.
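One of the 'chunks' mentioned above, bit/sample resolution reduction, is a standard technique that can be sketched briefly. This is a generic bitcrusher written from scratch for illustration, not code from Dodson's patch:

```python
# Generic bit/sample-resolution reduction (a "bitcrusher"), illustrating
# one of the processing chunks described above: quantise amplitude to a
# small number of bits and hold each value for several samples to lower
# the effective sample rate.

def bitcrush(samples, bits=4, downsample=4):
    """Reduce amplitude resolution and effective sample rate of a signal."""
    levels = 2 ** bits
    out = []
    held = 0.0
    for i, s in enumerate(samples):
        if i % downsample == 0:                        # sample-and-hold
            held = round(s * (levels / 2)) / (levels / 2)  # quantise amplitude
        out.append(max(-1.0, min(1.0, held)))          # keep within [-1, 1]
    return out

print(bitcrush([0.0, 0.3, 0.6, 0.9], bits=2, downsample=2))
```

Automating parameters like `bits` and `downsample` over time is one way a patch can keep evolving without the performer touching every control at once.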
MIKE FOYLE, 7am (2007)
The entirety of the sound synthesis and composition for this piece is done within Max/MSP. Mike says 'I decided to try and create an idm-esque production with a very obvious sense of progression and development and also tried hard to manipulate synthesised sounds as much as possible while retaining musical integrity, rhythm and structure.'