Planet Gear

The overall concept of this video revolves around images of imaginary astronomical phenomena. I selected a method which I anticipated would be appropriate to the construction of 'scaleless' objects, such that one could imagine them occupying planetary-sized volumes of space.

The content of this video was made with a simple video synthesizer program. I initially wrote this program as a way of generating visual accompaniment to my live performances. It consists of a pattern generator which uses a sonic input for real-time pattern modification. In the context of a live performance, this input would be some or all of the sound coming from my on-stage equipment. In the context of this video, the modifying sonic input came from splitting the overall sound signal of "Planet Gear" into separate instrumental elements and picking out passages according to their suitability. I say this because the sonic character of a passage has a direct bearing on whether or not it is suitable for translation into imagery.

The first section of the video synthesizer is made up of six input processors. These can be thought of as modules which generate a modulation signal which can then be used to modify the visual patterns. An input processor can be 'tuned' such that it 'listens' to a particular frequency range in the source audio material. As an example, it could be tuned such that it only registers information in the low frequency range. As such, it will 'listen' to instruments such as bass drums or bass guitars, register the overall amplitude of the signal in that range and turn it into a form that can be used to modulate the visual parameters. Any instrument which makes sound not in that part of the frequency range will not be registered and thus will not contribute to a modulation signal from that input processor.
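The behaviour described above amounts to a band-limited amplitude follower: measure how much energy the source audio carries near a chosen frequency, and emit that level as a modulation signal. The author's actual program is not shown, so the following is only an illustrative sketch of the idea, here using the Goertzel algorithm to estimate signal amplitude around a single target frequency:

```python
import math

def band_level(samples, sample_rate, target_hz):
    """Estimate the amplitude of `samples` near `target_hz` using the
    Goertzel algorithm (a single-bin DFT). A processor 'tuned' to the
    bass range would call this with a low `target_hz`."""
    n = len(samples)
    k = round(n * target_hz / sample_rate)   # nearest DFT bin
    w = 2.0 * math.pi * k / n
    coeff = 2.0 * math.cos(w)
    s_prev = s_prev2 = 0.0
    for x in samples:
        s = x + coeff * s_prev - s_prev2
        s_prev2, s_prev = s_prev, s
    power = s_prev2 * s_prev2 + s_prev * s_prev - coeff * s_prev * s_prev2
    # Normalise so a full-scale sine at the target frequency reads ~1.0.
    return math.sqrt(max(power, 0.0)) / (n / 2.0)
```

A bass drum hit would register strongly when `target_hz` is low, while a hi-hat in the same block would contribute almost nothing, which is exactly the selectivity the input processors rely on.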

The second section is made up of function generators which control the various visual parameters. These are: X and Y position; X and Y size; Red, Green and Blue components; Alpha channel. The function generators thus control the location, dimensions, colour and transparency of the on-screen objects. To a degree, they resemble a bank of oscillators in an analogue synthesizer. As such, each function is made up of various sine, sawtooth, pulse and noise waveform generators. Each waveform has independent controls for amplitude and frequency. In addition to that, low-pass filtering and sample-rate conversion are available for modification of the overall composite waveform. The resulting output is then mapped to whichever visual parameter is being controlled by that function generator.
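A function generator of this kind can be sketched as a sum of elementary waveforms, each with its own amplitude and frequency, whose composite output is then mapped into the range of a visual parameter. This is a minimal assumed model, not the author's code; the component names and the linear mapping are my own illustration:

```python
import math
import random

def function_generator(t, partials):
    """Sum sine, sawtooth, pulse and noise components at time `t` (seconds).
    `partials` is a list of (kind, amplitude, frequency_hz) tuples."""
    out = 0.0
    for kind, amp, freq in partials:
        phase = (t * freq) % 1.0
        if kind == "sine":
            out += amp * math.sin(2.0 * math.pi * phase)
        elif kind == "saw":
            out += amp * (2.0 * phase - 1.0)      # ramp from -1 to +1
        elif kind == "pulse":
            out += amp * (1.0 if phase < 0.5 else -1.0)
        elif kind == "noise":
            out += amp * random.uniform(-1.0, 1.0)
    return out

def map_to_parameter(value, lo, hi):
    """Map a bipolar function value (-1..+1 for unit amplitude) onto a
    visual parameter range, e.g. an X position in pixels."""
    return lo + (hi - lo) * (value + 1.0) / 2.0
```

For example, a slow sine plus a faster sawtooth summed and mapped onto X position would make an object drift while jittering; the low-pass filtering and sample-rate conversion mentioned above would then smooth or step-quantise that composite before mapping.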

The function generators are able to accept a modulation input signal from the input processors. In so doing, the functions can be modified by real-time sonic input. Assume, as in the example, that a particular input processor is set up such that it registers only bass signals. It is by selecting this input processor as a modulation source in a particular function generator that it is possible to vary a given visual parameter with bass information. The function may be amplitude- or frequency-modulated by the bass. So, in that way it is possible to alter the dimensions of an on-screen object with the bass, by amplitude-modulating the functions pertaining to X and Y size; likewise its location, by modulating X and Y position, and so on.
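The amplitude- and frequency-modulation routing described above can be sketched for a single sine-based function. The depth parameters and the unipolar 0..1 modulation signal are my assumptions, intended only to show the two modulation paths:

```python
import math

def modulated_value(t, base_freq, base_amp, mod, am_depth=0.0, fm_depth=0.0):
    """Evaluate a sine function at time `t`, with a 0..1 modulation signal
    `mod` (e.g. the bass level from an input processor) applied to its
    amplitude (AM) and/or frequency (FM) according to the depth settings."""
    amp = base_amp * (1.0 + am_depth * mod)    # AM: louder bass -> bigger swing
    freq = base_freq * (1.0 + fm_depth * mod)  # FM: louder bass -> faster cycle
    return amp * math.sin(2.0 * math.pi * freq * t)
```

Routing the bass processor into the X- and Y-size functions with a nonzero `am_depth` would make an object pulse in step with the bass drum, which is the effect the paragraph above describes.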

As there are six input processors, I typically tune them such that each one registers a specific part of the frequency range. For example, if I am using a recording of some drums, one processor can be tuned such that it registers the bass drum, the next to register the snare drum, another the hi-hat, another the ride cymbal and so on. In this way a whole drum kit can become available as a separated set of modulation signals with which the various on-screen objects can be modified. Each function generator can be configured to accept any one of the six input processors separately for amplitude and frequency modulation.
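A drum-kit tuning like the one described might be expressed as a simple band table. The specific frequency ranges below are hypothetical values of my own, chosen only to illustrate how six processors could divide the spectrum:

```python
# Hypothetical band assignments (Hz) for six input processors, following
# the drum-kit example: each processor 'listens' to one part of the range.
PROCESSOR_BANDS = {
    1: ("bass drum",   (40, 120)),
    2: ("snare drum",  (150, 400)),
    3: ("toms",        (80, 250)),
    4: ("hi-hat",      (3000, 8000)),
    5: ("ride cymbal", (5000, 12000)),
    6: ("full mix",    (20, 20000)),
}

def processors_for(freq_hz):
    """Return the IDs of all processors whose band contains `freq_hz`."""
    return [pid for pid, (_, (lo, hi)) in PROCESSOR_BANDS.items()
            if lo <= freq_hz <= hi]
```

Each function generator then simply names one processor ID as its amplitude-modulation source and one as its frequency-modulation source.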

The third section controls the type and amount of objects on-screen and how quickly they are updated and re-drawn.

To construct the video, the music was broken into sections, typically of eight or sixteen bars in length. I then selected a pertinent set of instruments for each section. I say pertinent in order to convey what I mentioned earlier: that as much as an instrument may sound good, it might not make interesting patterns! For each eight or sixteen bar section, the input processors were configured such that they supplied usable modulation information for the function generators. The function generators were then configured to make a basic image in keeping with the "imaginary astronomical phenomena" brief, ready for subsequent sonic modification. (Throughout many of the sections, I morphed from one configuration to another in order to generate an image that continually changes in character.)
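Morphing from one configuration to another, as mentioned in the parenthesis above, can be sketched as a linear interpolation between two sets of parameter values. This is my own minimal illustration; the actual program's morphing scheme is not documented:

```python
def morph(cfg_a, cfg_b, u):
    """Blend two parameter configurations (dicts of numeric settings with
    matching keys). u = 0.0 gives cfg_a, u = 1.0 gives cfg_b; sweeping u
    over a section's duration produces a continuously changing image."""
    return {key: (1.0 - u) * cfg_a[key] + u * cfg_b[key] for key in cfg_a}
```

Driving `u` from 0 to 1 across an eight-bar section would carry the image smoothly from one character to another while the sonic modulation continues on top.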

The main part of the project was then to investigate ways in which the functions could be sonically modulated such that the visual modifications began to approximate my imaginary astronomical inspiration, and of course also began to tie in to the accompanying musical content. This is for me where it gets really interesting as it is where one can begin to explore the staggering range of possible connections between the visual and aural spheres.



'Just A Souvenir' out now
CD or LP from Warpmart, Amazon
Download from Bleep, iTunes


Live Shows


NOVEMBER
19 France, Paris @ Trabendo
21 Austria, Graz @ Non Stop Cinema
22 Hungary, Budapest @ Diesel
24 Poland, Katowice @ Galeria Szyb Wilson
26 Germany, Berlin @ Volksbuhne Theatre
28 Holland, Amsterdam @ Melkweg
30 Belgium, Ghent @ Vooruit

DECEMBER
6 & 7 UK, Minehead @ ATP Festival
09 UK, London Astoria (w/ LFO)
11 UK, Glasgow @ ABC (w/ Nathan Fake)
12 UK, Manchester @ The Warehouse (w/ LFO & Luke Vibert & More) (SOLD OUT)
