LiveCoda: Wednesday the 24th of May @ Loop Bar, 23 Meyers Place Melbourne.
Presenting real-time programming madness as teams from across Melbourne battle for $600 worth of prizes, with live visual feedback on Loop's 23-foot screen. Also featuring live music, with Simulus improvising over a diverse range of electro-acoustic software, and the VS Chorus Crew bringing freestyles, a cappella beatbox breakdowns and instrumental grooves. Live coding from 5pm. Music from 8pm.
Information and registration at www.esci.org.
Real-time interactive music system programming is exactly where I'm coming from, and I agree that it would be difficult to find enough people to do music programming live. That's why LiveCoda works with standard programming languages. But the evening definitely has a computer music focus: if you haven't heard Simulus and are in Melbourne, you should definitely come and see them.
I think that keyboard-and-mouse performance can hold people's attention if they understand what is going on and if you can "move" (perform) fast enough: mouse and keyboard are pretty awkward and slow to work with. I suppose LiveCoda, using collaborative coding algorithms, is looking at how fast multiple people can code a solution.
Best regards,
Rob
Aha: Ross Benecia and Tim Kreiger. Ross studied music tech at La Trobe just before I did, and afaik we had the same supervisor (Jim Sosnin). Tim was one of my teachers long ago when I was at the ANU, that dome I linked to above was mostly made by a teacher we have in common (David Worrall).
Tim and I don't get on too well, but this has got me interested enough to come along anyway.
When I first saw this I thought it meant real-time interactive music system programming, and I was wondering where on earth people were hoping to find teams of musician/programmers... Not the case. A pity, though; I've seen it in a couple of concerts, using Forth-based languages. One was in a geodesic dome with a 16-channel surround sound system: http://www.avatar.com.au/worrall/domes/dome1.html
I think this is a different kind of real-time though: what I saw was people adding code to systems while they were running, algorithmically generating music. The performers "piloted" the real-time music generation by adding code on the fly.
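To make the distinction concrete, here is a minimal sketch (my own illustration, not from those concerts, and in Python rather than Forth) of "piloting" a running generative process: a loop keeps producing notes from whatever rule is currently installed, and the performer changes the music by installing new code mid-run rather than by playing notes directly.

```python
# A mutable registry holding the currently installed note rule.
# The generator loop only ever looks the rule up here, so swapping
# the entry changes the music without stopping anything.
rules = {"next_note": lambda step: 60 + (step % 4)}  # initial rule: a rising figure

def generate(steps):
    """Produce one MIDI note number per step using the current rule."""
    return [rules["next_note"](s) for s in range(steps)]

first = generate(4)   # notes under the initial rule

# The performer "types in" new source text while the system runs;
# exec-ing it replaces the rule on the fly.
new_code = "rules['next_note'] = lambda step: 60 + (step * 5) % 12"
exec(new_code)

second = generate(4)  # same loop, new behaviour

print(first, second)  # → [60, 61, 62, 63] [60, 65, 70, 63]
```

A real system would run the loop against an audio clock and compile code from a live editor, but the shape is the same: the code, not the keyboard, is the instrument.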
Musicians have found that performance with a QWERTY keyboard and mouse really doesn't hold people's attention, which is part of the reasoning behind things like the hyperinstruments project http://www.media.mit.edu/hyperins