This is a time lapse of the tower block beside my house. Frames were shot every 10 seconds from 9am until about 9pm. The bands at the left and right edges are continuations of the outermost pixel values, extended sideways to fill a full HD frame (1920x1080 pixels). This is similar to the technique used in the 1D movies below. Music is No More Time by Cyriak. I also made some time slice images from these photographs (without the edge effects). I've arranged them together here. In each still image, as you move from left to right, from right to left, from top to bottom or from bottom to top, you are travelling from morning to night. The stripy nature of the result is an indication of how variable the lighting conditions were throughout the day. These variations come from the motion of the sun across the sky (slow) and changes in the percentage of cloud cover (fast). The black line which looks like a random Gaussian process comes from slight movements in the telephone cable: the magnitude of its deviations from the mean is an indication of how the wind speed changed throughout the day.
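If you want to try this yourself, both tricks are a couple of lines in numpy. A rough sketch, not necessarily the exact code I used (taking the centre column for each time slice is just one of several possible choices):

```python
import numpy as np

def pad_to_hd(frame, width=1920):
    """Repeat the outermost pixel columns to fill a full HD width.
    Assumes the source frame is narrower than `width`."""
    h, w, c = frame.shape
    left = (width - w) // 2
    return np.pad(frame, ((0, 0), (left, width - w - left), (0, 0)),
                  mode='edge')

def time_slice(frames):
    """One pixel column per photograph, so the x axis of the
    resulting still runs from morning to night."""
    return np.stack([f[:, f.shape[1] // 2] for f in frames], axis=1)
```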
Katy agreed to sit still for a few minutes and be filmed. I took the footage and rearranged it like this. First, I made a photoquilt from the frames of a video clip (something like this). Then I created a computer simulation of sliding a camera along a sheet of ice over the photoquilt, shooting at normal video frame rate. In the simulation, you can nudge and pull the camera using the mouse and keyboard. You can also zoom in and out. If you slide the camera in a straight line, at the correct speed, you get something that looks like normal video. If the camera curves or goes at a different speed, you get strange framing effects. When the camera hits the edge of the image, it bounces, like a ball on a pool table. It's just a way of making motion again out of the still images. A couple of minutes of playing around with the sliding camera on screen, and many days of rendering later, you get this video. I need a 64-bit machine to speed up the rendering.
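Stripped of the interactive nudging, pulling and zooming, the heart of the simulation is small. A minimal sketch, assuming the photoquilt is one big numpy array and the camera is a frictionless viewport (the real thing has more to it):

```python
import numpy as np

class SlidingCamera:
    """A viewport gliding over the photoquilt as if on ice: constant
    velocity, no friction, billiard-style bounces at the edges."""

    def __init__(self, quilt, view_w, view_h, vx, vy):
        self.quilt, self.view_w, self.view_h = quilt, view_w, view_h
        self.x, self.y = 0.0, 0.0
        self.vx, self.vy = vx, vy

    def step(self, dt=1.0):
        self.x += self.vx * dt
        self.y += self.vy * dt
        max_x = self.quilt.shape[1] - self.view_w
        max_y = self.quilt.shape[0] - self.view_h
        # Bounce off the quilt edges like a ball off a pool-table cushion
        if self.x < 0 or self.x > max_x:
            self.vx = -self.vx
            self.x = float(np.clip(self.x, 0, max_x))
        if self.y < 0 or self.y > max_y:
            self.vy = -self.vy
            self.y = float(np.clip(self.y, 0, max_y))

    def frame(self):
        x, y = int(self.x), int(self.y)
        return self.quilt[y:y + self.view_h, x:x + self.view_w]
```

Each call to frame() after a step() is one frame of output video; the bounce is nothing more than a sign flip on the velocity.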
This was shot with Brian Liddy, using his camera on a sunny day in Galway. It was the first piece of HD footage I managed to get my hands on, and I ran it through several other manipulations, basically to see how much more CPU time my algorithms require to handle HD. This one came out the best, I think. The technique uses the same idea behind the Technicolour Stringly Glitch video I made with Mat Fleming, which you can also see below. Instead of simply splitting the three RGB channels and offsetting each of them with a time delay, I split the sum of the RGB values into more than three parts, equally spaced across the colour spectrum, and separated those with time delays instead. It's a bit tricky really. It was very nice of Sinead to dance for us. I now think this is a lot better without the sound, so perhaps you should turn it off.
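Roughly, the idea in code, as a sketch of the technique rather than my actual implementation (six bands and a three-frame delay are arbitrary choices):

```python
import colorsys
import numpy as np

def spectral_delay(frames, n_bands=6, delay=3):
    """frames: list of HxWx3 float arrays in [0, 1]. Splits brightness
    into n_bands hues, equally spaced around the colour spectrum,
    each seeing the scene at a different moment in time."""
    luma = [f.sum(axis=2) / 3.0 for f in frames]
    tints = [np.array(colorsys.hsv_to_rgb(k / n_bands, 1.0, 1.0))
             for k in range(n_bands)]
    norm = np.maximum(sum(tints), 1e-6)  # channel-wise sum of the tints
    out = []
    for t in range((n_bands - 1) * delay, len(frames)):
        acc = np.zeros_like(frames[0])
        for k in range(n_bands):
            # Band k is tinted with hue k and delayed by k * delay frames
            acc += luma[t - k * delay][:, :, None] * tints[k]
        out.append(np.clip(acc / norm, 0.0, 1.0))
    return out
```

Where nothing moves, the delayed bands recombine into the original greys; wherever something moves, they separate into fringes of colour.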
Mat Fleming shot this with Christo and Illana. He had been explaining to me some cunning techniques for getting technicolour effects by putting red, green and blue filters over camera lenses and then projecting the three final images simultaneously. I guess this would work with black and white film if you put some filters on the projector as well. Or something. Anyway, we messed around a bit with doing this to still photographs, both digitally and with physical filters. Finally, I wrote some code that applies the manipulation to every frame of a digital video. Only of course this footage was originally 8mm film. I think it looks great, better than the HD video above, largely because of the gorgeous flashes of colour caused by defects in the original film footage. I'd like to make some more things like this.
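Digitally, the trick boils down to letting the red, green and blue channels see the scene at slightly different moments in time. Something like this sketch (the two-frame delay is an arbitrary choice):

```python
import numpy as np

def technicolour(frames, delay=2):
    """frames: list of HxWx3 arrays. Offsets R, G and B in time, as if
    three filtered projectors were running slightly out of step."""
    out = []
    for t in range(2 * delay, len(frames)):
        out.append(np.stack([frames[t][:, :, 0],              # red: now
                             frames[t - delay][:, :, 1],      # green: a little behind
                             frames[t - 2 * delay][:, :, 2]], # blue: further behind
                            axis=2))
    return out
```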
I made these two videos with James Johnson-Perkins for a residency we did in Summer 2007 for rednile in Sunderland. They were made from clips from 2001: A Space Odyssey by Stanley Kubrick. The video frames were sliced up and rearranged according to instructions generated by a quasi-random sequence. Doing this to film makes me think about movement in the frames, and about the significance of panning, zooming and the motion of the subjects. We selected the opening sequence in particular because of the intricate dance of the spaceships. Using such a famous sequence and leaving the audio intact gives me a chance to struggle to interpret what's been done to the footage, and to think a bit more about what makes the original footage work.
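The slicing might look roughly like this; I'm using a golden-ratio low-discrepancy sequence here as a stand-in for whichever quasi-random generator we actually used, and vertical strips as a stand-in for our slicing geometry:

```python
import numpy as np

PHI = (np.sqrt(5) - 1) / 2  # golden ratio conjugate

def quasi_random_order(n, offset=0.0):
    # Low-discrepancy sequence: fractional parts of offset + k * PHI
    return np.argsort((offset + np.arange(n) * PHI) % 1.0)

def rearrange_frame(frame, n_slices=32, t=0):
    """Cut the frame into vertical strips and reorder them; the frame
    index t varies the permutation over time."""
    h, w, _ = frame.shape
    xs = np.linspace(0, w, n_slices + 1).astype(int)
    strips = [frame[:, xs[i]:xs[i + 1]] for i in range(n_slices)]
    order = quasi_random_order(n_slices, offset=t * PHI)
    return np.hstack([strips[i] for i in order])
```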
This is another video I made with James Johnson-Perkins. It doesn't really work too well on vimeo at the moment. First of all, it is designed to be played in an infinite loop. The guys at vimeo are working hard to implement a loop button in their flash player. Secondly, it looks much better in HD. You can see it in higher resolution here. Unfortunately, due to bandwidth problems, vimeo do not currently allow embedding of videos in HD. But best of all, if you have a vimeo account, you can download an .avi file here, and loop it yourself.
Another collaboration with James Johnson-Perkins. This is based on a function somebody created in the 80s (sadly I have lost the reference). It takes an integer as input and generates a list of complex numbers; if you map the real and imaginary parts onto coordinates in the 2-dimensional real plane and connect them, you often get surprisingly ordered and recognisable shapes. Sometimes this freaks me out a little bit. I imagine that these are pictures of demons lurking in the complex plane. I showed these shapes to James and he said that I should get with the colour, which certainly makes things less terrifying. This was shown on urban screens in cities across the UK as part of Do Billboards Dream of Electric Sheep? in 2007.
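Since I've lost the reference I can't give you the exact function, but exponential sums behave in just this way and make a plausible stand-in: partial sums of points on the unit circle, plotted and connected in the complex plane. A sketch (the constants are arbitrary):

```python
import numpy as np
import matplotlib.pyplot as plt

def demon(N, a=3.0, b=1994.0, c=26.0):
    """Partial sums of exp(2*pi*i*(n/a + n^2/b + n^3/c)) for n = 1..N."""
    n = np.arange(1, N + 1)
    return np.cumsum(np.exp(2j * np.pi * (n / a + n ** 2 / b + n ** 3 / c)))

z = demon(20000)
plt.plot(z.real, z.imag, linewidth=0.3)
plt.axis('equal')
plt.show()
```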
The footage in this is a few minutes from Vampyros Lesbos by Jesus Franco. The music is Toccata and Fugue in D minor by J. S. Bach. I've been sort of obsessed with the problem of synchronising audio and video for a while now; I made this in 2004, for example. Visualisation widgets in media players like winamp attempt to automatically synchronise video events with audio using real-time fast Fourier transforms, looking at the changing frequency spectrum of the audio signal. This is a pretty nifty technique, and is used a lot by VJs too. It's probably the best you can do with an existing audio input, but there is a huge amount of information that the human brain can extract from audio about the timing of notes of different pitch and timbre that this kind of analysis fails to capture. This video is an attempt to get perfect synchronisation by using musical notation (MIDI in this case) to generate both video and audio from the same set of instructions, ensuring synchrony and the correct interpretation of events. This is much more restrictive than FFT methods, since you must have musical notation, not just music. But I like it anyway. And it's got lesbian vampires. Adrin Neatrour has reviewed the screening of this video at the Star & Shadow cinema Eyes Wide Open film night here.
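The MIDI end of this is the easy part: read out the note events with their times and quantise them to frame numbers, so the video renderer and the synthesiser work from one and the same list. A sketch using the mido library (25 fps and the note-on-only filter are assumptions, not necessarily what I did):

```python
import mido

def note_events(midi_path, fps=25):
    """Return (frame_number, pitch, velocity) triples, one per note-on,
    so video events land on exactly the frames where the notes sound."""
    events, t = [], 0.0
    for msg in mido.MidiFile(midi_path):
        t += msg.time  # iterating a MidiFile yields delta times in seconds
        if msg.type == 'note_on' and msg.velocity > 0:
            events.append((round(t * fps), msg.note, msg.velocity))
    return events
```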
Claire very kindly drove me down the Newcastle motorway in her trusty red car with my camera poking out of her sunroof. I did this because I wanted to make a video where video events (in this case objects and light moving out of frame) generated sound. This is another way to create perfectly synchronised audio and video: generate the audio from the video (rather than the other way round, which I tried above). I was inspired to look at things in this opposite fashion after seeing some of Guy Sherwin's films, particularly one where he filmed intermittent light and dark while walking alongside some railings with the sun shining through them obliquely onto the camera lens. He then allowed the intermittent light and dark pattern to run onto the audio reading head of the projector, which gives perfectly synchronised film-generated audio.
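A digital version of the same idea, sketched below: let the brightness of the outermost pixel columns, where things leave the frame, drive the amplitude of a tone. This isn't the mapping the finished piece uses, and the 220 Hz sine is an arbitrary choice:

```python
import numpy as np

def edge_brightness_audio(frames, fps=25, sample_rate=44100, freq=220.0):
    """frames: list of HxW greyscale uint8 arrays. The mean brightness
    of the leftmost and rightmost columns becomes an amplitude envelope."""
    env = np.array([np.concatenate([f[:, 0], f[:, -1]]).mean() / 255.0
                    for f in frames])
    # Upsample the per-frame envelope to audio rate
    n = int(len(frames) * sample_rate / fps)
    env = np.interp(np.linspace(0, len(env) - 1, n),
                    np.arange(len(env)), env)
    t = np.arange(n) / sample_rate
    return env * np.sin(2 * np.pi * freq * t)
```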
These two videos are a good example of me being obsessed with controlling every goddamn pixel on the screen. They're made from still images. There are many more of these here. Another example of this technique, EMF, was shown on urban screens across Russia in 2007 as part of OutVideo07. There is a description of how I made them here.
And finally, here are two eye movies. Try playing them both simultaneously. For a while there, this was the only bit of footage that I had, so I did many tricks to it. I like how these two examples came out. The 3D one takes a really long time to render. They were worth making; they have grossed many people out. The second one was shown as part of the DAAMN residency at Gallery Glue in Heaton, Newcastle, in November 2007.