I did the music and sound design for Volatilecycle’s motion graphics showreel: One Year In Nottingham Showreel from Richard Thompson on Vimeo.
Here’s a remix I did for I Am Jupiter, from their upcoming EP ‘Trucks’:
For a while now I’ve known I wanted to be able to use data to generate interesting music procedurally, or algorithmically. Although not a new idea, I like the possibility that the control data for the algorithms could come from an image, or some other realtime source, and that the music would relate somehow to the data while being a piece in its own right…
I’ve spent an awfully long time working out seemingly trivial technical points with this project so far, but now I feel reasonably fluent in Pure Data and openFrameworks, and have created a reasonably flexible development setup using oF, Pd and Logic Pro as a temporary sound source. Although the ultimate goal is to integrate everything into a single application, having the component parts separated out like this means…
I’ve made good progress in understanding what I can do with openFrameworks, how I can get it to interact with Pure Data, and what its limitations are. Learning openFrameworks has proved challenging: although there are plenty of ‘getting started’ tutorials that make the basics look easy, there’s a real jump up to the more complicated stuff, which requires a good depth of understanding of the world of C++.
I’m making slow and steady technical progress through the project: I’ve started to write simple openFrameworks applications in Xcode. I’m going to do the graphical stuff with this, ‘wrapping’ a Pure Data patch into the application; Pd will deal with the sequencing and, ultimately, the sound generation too. Here are the results of an oF application that analyses an image and graphs the average hue, saturation and brightness for…
Posting some pictures from last year’s ‘colourorgan’ project. Here a yellow chair is offered up, the software presenting a corresponding yellow ‘matrix’ on screen (which was used to drive an audio sequence) alongside an image tagged ‘yellow’ from a Google image search. Here are a couple from the IOCT showcase, with the software in action again, and a rather fetching picture of me explaining something while unwittingly sporting a large…
My original proposal for my Master’s final project went along the lines of: “Exploring the world through music: an investigation of data sonification (and related techniques) in computer generated music and audio mixing.” I’d done a bit of work in the past around the idea that you could analyse an image and have some sort of music generated from it. The ‘colourorgan’ used colour information to generate and play specific sounds and sequences…
Last year I made a patcher called the ‘Flickr Playr’ that looks at your Flickr photo stream, downloads the images and creates ambient music based on them. I’ve finally fixed all the problems with it and created a nice simple interface: you can enter your Flickr ID and generate some nice music! Here’s a little video demo. When I get round to it I’ll post the app here.
The date for the IOCT showcase looms: I have just over a week to decide exactly what I want to put on and prepare the practical stuff (as well as doing the write-up for the colour organ piece). My original plan was to add a control system to the organ, building a Reactable-style control table. While I still want to build this, given the time constraints I don’t…