As many of you know, Radiohead is having a video contest dealie. Lance pointed this out to me back in March and suggested that I give it a go. And a go I did give.
First, however, I will discuss why I have decided not to submit my entry.
After I started working on the piece, I read the rules and regulations and reread the original post on Wired. And then I rewatched my piece. That's when I decided to skip the contest. It's not because I don't like the piece I made. Quite the opposite, actually. But after watching and watching, I realized my piece has nothing to do with Radiohead at all. You could swap in any electronica song and it would actually fit a little better. My piece just doesn't feel Radioheadish.
Also, the contest seems to focus on the notion of storyboards and animations. That seems to be what they are looking for: a user-made, Paranoid Android-type video. Something with a story. Something with characters. Something with personality. And I am afraid my piece is lacking in all three categories.
But still, I do like the piece, and now I will discuss why.
Firstly, it's got some mad crazy super duper beat detection going on. I reused the application I wrote to make the Goldfrapp piece but went a few steps further. I manually input each bass beat, snare, hi-hat, and arpeggio note, not to mention all the vocals and syllable breaks. Crazy! It took about 6 hours, but I think it was the right way to do it. I would have wasted much more time than that had I chosen to do the beat detection algorithmically. But man, scrubbing through that track, over and over, marking every note… it was a carpal pain.
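For anyone curious how a hand-marked timeline like that can drive visuals, here is a minimal sketch of the idea — the class and method names are my own guesses for illustration, not the actual tool. Events are stored as (timestamp, label) pairs entered while scrubbing, and each rendered frame pulls whichever events fall inside its slice of the track:

```java
import java.util.*;

// Hypothetical hand-marked beat timeline: each event is a (timestamp, label)
// pair entered manually while scrubbing through the track.
class BeatTimeline {
    record Event(double timeSec, String label) {}

    private final List<Event> events = new ArrayList<>();

    void mark(double timeSec, String label) {
        events.add(new Event(timeSec, label));
        events.sort(Comparator.comparingDouble(Event::timeSec));
    }

    // Return every event whose timestamp falls inside the current frame's
    // time window [frameStart, frameStart + frameDur).
    List<Event> eventsInFrame(double frameStart, double frameDur) {
        List<Event> hits = new ArrayList<>();
        for (Event e : events) {
            if (e.timeSec() >= frameStart && e.timeSec() < frameStart + frameDur) {
                hits.add(e);
            }
        }
        return hits;
    }
}
```

At 30 fps, a frame starting at 1.0 s would pick up both a snare at 1.0 s and a hi-hat at 1.02 s, so tightly clustered hits can still trigger distinct visual responses in the same frame.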
Secondly, it is Processing from start to finish. There is no post processing (oooh, a literal pun!) or editing after the fact. I import the audio data from the analysis, augment it with the direct FFT data from the Sonia analysis, press play, and after it is done, I have the finished piece. This is both a plus and a minus.
The plus is that it kinda rocks to have a full video like this created with code. An unnecessary restriction really, but a bragging point for sure. The minus is that it highlights the weaknesses in my programming. Mainly, the camera object.
I wanted a more robust camera movement in the piece but just didn’t have time to figure it out properly. I played with attaching the camera to springs and changing the target programmatically, but the end result was rather jerky and very obviously springy. Ideally, it would have more of a handheld camera feel, but I just couldn’t quite get there.
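For what it's worth, that springy look usually comes down to damping: a spring with light damping overshoots its target and oscillates, while a critically damped spring settles onto the target without ringing. Here is a 1-D sketch of the trade-off — the parameter names and values are mine, not from the piece:

```java
// Hypothetical 1-D spring-driven camera follow. With light damping the
// camera overshoots the target and oscillates (the "obviously springy"
// look); critical damping (damping = 2 * sqrt(k)) settles without ringing.
class SpringCamera {
    double pos = 0, vel = 0;
    final double k;        // spring stiffness
    final double damping;  // damping coefficient

    SpringCamera(double k, double damping) {
        this.k = k;
        this.damping = damping;
    }

    // Semi-implicit Euler step: update velocity first, then position.
    void update(double target, double dt) {
        double accel = k * (target - pos) - damping * vel;
        vel += accel * dt;
        pos += vel * dt;
    }
}
```

A handheld feel is a different problem again — that is usually faked by layering low-frequency noise on top of a smoothed follow, rather than by tuning the spring itself.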
Also, the flocking movement feels less organic than some flocking experiments I have done in the past and I plan on addressing this eventually.
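For context, the usual starting point for flocking is the Reynolds boids model: each agent steers by three rules — cohesion toward the centroid of its neighbors, alignment with their average velocity, and separation from anyone too close. This is a generic 2-D reduction for illustration, not the flocking code from the piece; all the weights are made up:

```java
import java.util.*;

// Minimal 2-D boids step (Reynolds-style cohesion, alignment, separation).
// Weights and distances here are arbitrary illustration values.
class Boid {
    double x, y, vx, vy;
    Boid(double x, double y) { this.x = x; this.y = y; }
}

class Flock {
    final List<Boid> boids = new ArrayList<>();
    double cohesionW = 0.01, alignW = 0.05, sepW = 0.1, sepDist = 1.0;

    void step() {
        int n = boids.size();
        double[] ax = new double[n], ay = new double[n];
        for (int i = 0; i < n; i++) {
            Boid b = boids.get(i);
            double cx = 0, cy = 0, avx = 0, avy = 0, sx = 0, sy = 0;
            for (int j = 0; j < n; j++) {
                if (i == j) continue;
                Boid o = boids.get(j);
                cx += o.x;  cy += o.y;    // cohesion: centroid of the others
                avx += o.vx; avy += o.vy; // alignment: their average velocity
                double dx = b.x - o.x, dy = b.y - o.y;
                double d2 = dx * dx + dy * dy;
                if (d2 < sepDist * sepDist && d2 > 1e-9) {
                    sx += dx / d2;  sy += dy / d2;  // separation: push apart
                }
            }
            int m = n - 1;
            ax[i] = cohesionW * (cx / m - b.x) + alignW * (avx / m - b.vx) + sepW * sx;
            ay[i] = cohesionW * (cy / m - b.y) + alignW * (avy / m - b.vy) + sepW * sy;
        }
        for (int i = 0; i < n; i++) {  // integrate after all forces are known
            Boid b = boids.get(i);
            b.vx += ax[i];  b.vy += ay[i];
            b.x += b.vx;    b.y += b.vy;
        }
    }
}
```

The organic feel tends to live in the details this sketch leaves out: limited perception radius, capped steering force, and a bit of per-boid noise, which is where the tuning time usually goes.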
You can view the full 200MB QuickTime here.
Or you can watch the Vimeo version below. I recommend the QuickTime for now, but once I render it out in HD, I will link to the Vimeo HD version.
After reading all these wonderful comments from people telling me to submit the video, I relented and added my offering to the 700+ already submitted.