(Temporarily abandoning my plan to blog about all of my projects in chronological order, seeing as I’m now two years behind.)
Exogenesis is my new demo for Raspberry Pi and Novation Launchpad MIDI controller, written as an exercise in creating a coherent visual narrative on an oh-so-limited display. Music is by songster and zinc-vending supremo Hoopshank, and the demo was presented at this weekend’s Sundown demoparty, where it won first place in the Wild demo competition.
Visuals were programmed in Python (synchronised with the music by hand – there’s no spectrum analysis of the audio or anything like that going on) and sent to the Launchpad as a stream of MIDI ‘note-on’ events. There’s no particular reason for running it from a Raspberry Pi, other than ‘because I can’ – the code ought to be portable to more or less anything that can run Python and has a USB port.
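For the curious, driving the Launchpad's grid this way can be sketched in a few lines. This is a hypothetical illustration, not the demo's actual code: the grid-to-note mapping and velocity colour packing follow the original Launchpad's published MIDI layout, and the `pygame.midi` usage in the comment is just one way to get the events onto the wire.

```python
# Hypothetical sketch of lighting Launchpad pads via MIDI note-on events.
# On the original Launchpad, each pad maps to note number 16*row + column,
# and the velocity byte packs the red/green LED brightness levels.

def pad_note(row, col):
    """MIDI note number for the pad at (row, col) on an original Launchpad."""
    return 16 * row + col

def pad_velocity(red, green):
    """Velocity byte: green brightness (0-3) in bits 4-5, red (0-3) in
    bits 0-1, plus the 'copy' and 'clear' double-buffering flags."""
    return (green << 4) | red | 0x0C

# Actually sending the events needs a MIDI library; with pygame.midi it
# might look something like:
#   import pygame.midi
#   pygame.midi.init()
#   out = pygame.midi.Output(pygame.midi.get_default_output_id())
#   out.note_on(pad_note(3, 4), pad_velocity(3, 0))   # pad (3,4) bright red
```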
This also happens to be my first production under the Wavesitter label. I figured that since I’ve done quite a few productions with guest musicians, and hope to do a lot more of that in the future, I should make that an official thing with its own name, rather than being “Gasman and X” all the time. In other words, Wavesitter is to Matt Westcott what Nine Inch Nails is to Trent Reznor. Or, indeed, what Simply Red is to Mick Hucknall. Insert your own comparisons here. (Also, if I’m not mistaken, it’s a literal translation of the German word for budgerigar, Wellensittich. Which I think is kind of neat.)
My woefully late roundup of stuff I’ve made continues with this project from August 2011. As long-term followers of this blog may recall, since mid 2008 I’ve been undergoing a long-term exercise of trying to keep up a jet-setting demoscene lifestyle without flying – and that can lead to some pretty creative journey planning. One such occasion was my Week Of Geek in 2011, a round trip taking in the Assembly demoparty in Helsinki, then down to Berlin (via the sleeper train from Malmö, which unexpectedly involved the train being loaded onto a ferry. A train on a boat! A train! On a boat! But that’s another story) for the first two days of Chaos Communication Camp before whizzing off by ICE to Cologne for the Evoke demo party.
All visitors to CCC received a r0ket badge – a USB-powered gadget equipped with a 70MHz ARM processor, a whopping 32KB of flash, a joystick switch, and a Nokia-3310-stylee mono 96×68 LCD. Ostensibly, this was a way of providing people with name tags that were suitably illuminated for the night-time activities at the camp; in reality, of course, it was a geek toy to hack around with, and for me it was a perfect opportunity to earn some major geek points by being the first person to show one off to a demoscene crowd on the other side of Germany, while the camp was still going on. So, what sort of eyecandy can you do in two days with a low-res display, a joystick, and a comparatively-beefy-but-floating-point-lacking CPU? Wolfenstein 3D, that’s what.
The principle is the same as it was in the 286 days: set up your viewpoint at a position somewhere on a 2D map; send rays fanning out from that point, one for each pixel column of the screen, until each one hits a wall; and draw a vertical slice of texture corresponding to the part of the wall it touches, scaled according to how far away that is. I first learned of this back in my Uni days from Tristam Fenton-May, who designed a hardware implementation of the Wolfenstein engine for his final year project, and came up with a brilliantly oddball way to avoid having to allocate memory for a full display buffer: the whole thing would be rendered column-by-column on a CRT display placed on its side. Lovely.
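If you've never seen it spelled out, the ray-per-column idea boils down to something like this. A toy sketch (in Python rather than the r0ket's C, and using a naive step-along-the-ray march rather than anything clever like DDA) — the map, field of view, and screen height here are made up for illustration:

```python
import math

# Toy map: '1' cells are walls, '0' cells are empty floor.
MAP = ["1111",
       "1001",
       "1001",
       "1111"]

def column_heights(px, py, angle, fov=math.pi / 3, columns=8, screen_h=68):
    """For each pixel column, march a ray out from (px, py) until it
    enters a wall cell, then scale the wall-slice height inversely with
    the distance. Returns one slice height per column."""
    heights = []
    for c in range(columns):
        # Fan the rays across the field of view, one per column.
        ray = angle - fov / 2 + fov * c / (columns - 1)
        dist = 0.0
        while dist < 20.0:
            dist += 0.01
            x = px + dist * math.cos(ray)
            y = py + dist * math.sin(ray)
            if MAP[int(y)][int(x)] == '1':
                break
        # Project onto the view plane to correct the fisheye distortion.
        dist *= math.cos(ray - angle)
        heights.append(min(screen_h, int(screen_h / dist)))
    return heights
```

The real thing additionally works out which part of the wall texture the ray hit, so each column can be drawn as a scaled slice of texture rather than a flat bar.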
My initial experiments with the r0ket board showed that the screen was laid out in a peculiar vertical arrangement of bytes, which was reason enough to steal Tristam’s idea and do my column-by-column rendering straight to the screen in one swoop. In hindsight, this was probably a bit silly – mingling the display routines with the 3D calculations resulted in some horribly spaghetti-ish code – but then again, I think we’re allowed a bit of spaghetti code in fun projects like this. And, in fact, the overheads of the r0ket system software meant that we only had around 2.5K to play with, so being frugal with memory was probably no bad thing.
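That "peculiar vertical arrangement" is common on small mono LCDs: each byte of display memory holds eight vertically-stacked pixels, so a full pixel column is just a short run of bytes — which is exactly why column-at-a-time rendering maps onto it so neatly. A sketch of the packing (the assumption that bit 0 is the topmost pixel of each group is mine; the r0ket's actual bit order may differ):

```python
def pack_column(pixels):
    """Pack a column of 1-bit pixels (top to bottom) into display bytes,
    eight vertically-stacked pixels per byte. Bit 0 is taken to be the
    topmost pixel of each group of eight (an assumption, not the r0ket's
    documented ordering). The final byte is zero-padded if the column
    height isn't a multiple of 8."""
    out = []
    for base in range(0, len(pixels), 8):
        b = 0
        for bit, px in enumerate(pixels[base:base + 8]):
            if px:
                b |= 1 << bit
        out.append(b)
    return out
```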
In advance of my appearance at the Ultrachip Festival in Edinburgh next month (19th-20th August! Two nights of awesomeness from the UK’s finest chiptune musicians! Free entry! W00t!), I thought this would be a good time to reveal the secret weapon at the heart of my live shows. Ladies and gentlemen, behold… the Synchronizatron 3000.
Out of all my projects, I like this one a lot. I like it because it brought me out of my comfort zone and into the murky world of hardware design (aided by the Arduino project which does a fine job of making that world accessible to electronics noobs like me). I also like it because it elegantly solves a problem that, in all likelihood, nobody in the world but me has. But most of all, I like it because it has a pair of blinky LEDs on the top which serve no meaningful purpose.
Read on for the complete notes/transcript of the talk (in hopefully more coherent form than the talk itself – next time I promise to spend less time on the flashy demo and more time figuring out exactly what I’m going to say…)
Update 2010-06-08: Oops. In the process of testing how Safari 5 shapes up, I discovered a rather silly oversight: the audio buffering routine was set up to never use more than 10% of CPU. Now that I’ve fixed it, it turns out that Chrome and Safari (at least) have no trouble at all playing Jugi’s Dope theme in its 28-channel glory. (However, taking the brakes off the buffering does mean that we can’t reliably pause the audio any more. A small price to pay, I think you’ll agree.)
I’ve rewritten the DivIDEo converter app in pure C, and as a result it’s now available in friendly standalone Windows and Mac OS X command line executables (and slightly less crazy and Ruby-ish to compile for other platforms). All the necessary libraries (including a major chunk of ffmpeg) are compiled in, so now there’s nothing standing between you and full-on ZX Spectrum video converting action. Head over to the DivIDEo website for the downloads.
Incidentally, a couple of people have asked about the identity of the singer in the Outline presentation. Apparently, while that clip is what we sneeringly refer to as an “internet phenomenon”, it’s not quite reached 100% saturation, so: it is Edward Anatolevich Hill, with a Russian TV performance of the song “I am very glad, because I’m finally back home”, or as it’s becoming increasingly better known, Trololololo.
Six years after my first tentative attempts at streaming video from the DivIDE interface were presented at Notcon 2004, I’ve finally come up with a system that I’m happy with. It boasts 25fps playback with audio somewhere above the ‘nails in a vacuum cleaner’ quality of previous attempts (through the use of delta compression on the video data and variable bitrate audio to use up whatever processor time is left), a one-shot conversion utility that handles all the video decoding, rendering and re-packing, and a player routine that more or less respects the ATA spec (so won’t fall apart as soon as someone else tries it on a different CompactFlash card. Hopefully). Here’s how I presented it at the Outline demo party:
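The delta-compression idea at the heart of it is simple enough to sketch: instead of storing every frame in full, store only the bytes that differ from the previous frame, as (offset, data) patches. This is an illustrative Python sketch of the general technique, not the converter's actual patch format:

```python
def delta_encode(prev, cur):
    """Encode frame `cur` as a list of (offset, changed-bytes) patches
    against the previous frame -- runs of identical bytes cost nothing."""
    patches = []
    i = 0
    while i < len(cur):
        if cur[i] != prev[i]:
            j = i
            while j < len(cur) and cur[j] != prev[j]:
                j += 1
            patches.append((i, cur[i:j]))
            i = j
        else:
            i += 1
    return patches

def delta_apply(prev, patches):
    """Rebuild the current frame by patching a copy of the previous one."""
    frame = bytearray(prev)
    for offset, data in patches:
        frame[offset:offset + len(data)] = data
    return bytes(frame)
```

The payoff on the Spectrum side is that a mostly-static scene costs almost no disk bandwidth or CPU to display, and whatever processor time is left over can be spent on the variable-bitrate audio.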
Anyone typing it in in its entirety would be rewarded with this:
Not bad for an evening’s work. Mind you, I did take an ever so teeny shortcut, by writing a Ruby program to convert a MIDI file to BEEP format. (Any .mid file will do, although ones with a single instrument will survive the rather primitive selective-note-butchering process better. Oh, and anything much longer than this one will exceed the 48K Spectrum memory…) And now you can try it out too:
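The core of any MIDI-to-BEEP conversion is a pleasingly small calculation, because Spectrum Basic's BEEP statement takes a duration in seconds and a pitch in semitones relative to middle C – and middle C is MIDI note 60. A Python sketch of that mapping (my Ruby original does the same job, plus the note-butchering; the function name and defaults here are made up):

```python
def midi_note_to_beep(note, ticks, tempo_us=500_000, ppqn=480):
    """Convert one MIDI note event to a Spectrum Basic BEEP statement.

    note     -- MIDI note number (middle C = 60)
    ticks    -- note length in MIDI ticks
    tempo_us -- microseconds per quarter note (500,000 = 120bpm)
    ppqn     -- the MIDI file's ticks per quarter note
    """
    duration = ticks * tempo_us / (ppqn * 1_000_000)  # seconds
    pitch = note - 60  # semitones relative to middle C
    return f"BEEP {duration},{pitch}"
```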
Update 2010-05-26: Karl McNeil has adapted Midibeep into a variant called Mid2ASM, which outputs an assembler listing rather than Basic – this enables the data to be packed much more efficiently, paving the way for altogether longer pieces of music. Download Mid2ASM (453K, Windows EXE included)
Update 2010-06-02: Another update from Karl, featuring a Windows GUI, more space-saving tweaks, and embedding the output in a Basic REM statement. Download Mid2ASM v2 (3.4MB)
Update 2011-04-09: Karl McNeil has released the last version of Mid2ASM for a while, version 3.2 – featuring primitive importing from .sid .psg and .wav, and the choice of Basic or assembly output. The source archive also contains the command-line version, midibeep2.
A new version of JSSpeccy is out. It doesn’t run at full speed on an iPhone either (although it positively speeds along on recent versions of Safari on real computers), but it does boast the following changes:
- GPL v3 licensed, with prominent notices to make it clear that playing silly buggers like the above will not be tolerated (even if they do include source…)
- A bit of speed optimisation (about 15% faster, maybe)
- A pimped-up user interface with shiny icons
- And most relevantly, entirely controllable via the iPhone / iPod Touch touchscreen. In principle. (If you’re expecting an immersive gaming experience, you’ll be disappointed.)
So there you go – probably the best Spectrum emulator for the iPhone ever. And it’s free.