11 Dec '20
Student project: animated 3D code rendering engine
I’m trying to get better at writing down potential student project ideas as they
come to me. For the moment, I’m doing this on my blog using the
As a livecoder I care more than most about how my code looks on the screen. While I’ve written in the past about the potential of animated “tooltip” style code annotations, I’d love to have more detailed (parametric) control over the display of the code/text itself, especially in a “handwritten” style. Being able to “write” the code in real time, in sync with the music, in a livecoding setup would look super cool, and I think it might even help folks follow which bits of code are responsible for which parts of the music.1
Making this happen would require:
a handwriting synthesis model (something like this one by Alex Graves)
(optionally) a 3D version (which would give output like this, but automated rather than manual)
a way of applying keyword colouring & other standard code display niceties (because it’d be nice to not lose the things that a regular old IDE/text editor can do)
a protocol for a livecoding IDE to communicate (including timing information) with this code synthesis engine so that the code would be displayed/drawn (and re-drawn) in response to the music in funky ways
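To give a feel for the first requirement: in a Graves-style handwriting model, the network’s output layer parameterises a mixture of 2D Gaussians over the next pen offset plus a pen-lift probability, and drawing is just repeated sampling. Here’s a toy sketch of that sampling step, with hard-coded mixture parameters standing in for real network output (the parameter values and function name are invented for illustration):

```python
import random

def sample_offset(weights, means, stdevs, pen_lift_p):
    """Sample one (dx, dy, pen_up) stroke step from mixture parameters."""
    # Pick a mixture component in proportion to its weight...
    k = random.choices(range(len(weights)), weights=weights)[0]
    # ...then sample the pen offset from that component's 2D Gaussian.
    dx = random.gauss(means[k][0], stdevs[k][0])
    dy = random.gauss(means[k][1], stdevs[k][1])
    # Pen lifts are a Bernoulli draw (end of a stroke).
    pen_up = random.random() < pen_lift_p
    return dx, dy, pen_up

# Hypothetical parameters standing in for one step of network output.
weights = [0.7, 0.3]
means = [(1.0, 0.0), (0.0, -1.0)]
stdevs = [(0.1, 0.1), (0.2, 0.2)]
stroke = [sample_offset(weights, means, stdevs, 0.05) for _ in range(50)]
```

In the real model those parameters would be re-predicted at every step conditioned on the text being written, which is what makes the output look like continuous handwriting rather than noise.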
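The keyword-colouring requirement is the most conventional piece. As a minimal sketch, assuming the livecoded language were Python, the standard-library tokenizer is enough to tag tokens with colour classes; a real system would use a lexer for whatever livecoding language the IDE actually speaks:

```python
import io
import keyword
import tokenize

def colour_tokens(code):
    """Yield (token_text, colour_class) pairs for a source string."""
    for tok in tokenize.generate_tokens(io.StringIO(code).readline):
        if tok.type == tokenize.NAME and keyword.iskeyword(tok.string):
            yield tok.string, "keyword"
        elif tok.type == tokenize.STRING:
            yield tok.string, "string"
        elif tok.type == tokenize.NUMBER:
            yield tok.string, "number"
        elif tok.type in (tokenize.NAME, tokenize.OP):
            yield tok.string, "plain"

result = list(colour_tokens("for x in range(3): print(x)"))
```

The handwriting engine would then map each colour class to a pen/ink style rather than to a terminal colour.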
…and it’d all have to run in real-time during a performance (although it needn’t be running on the livecoder’s machine—as mentioned above it’d ideally use some sort of nrepl-style protocol and could be done over the network).
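As a sketch of what a timing-aware message from the IDE to the rendering engine might look like: nrepl-style protocols send small self-describing messages, so a JSON payload carrying the code, the musical time at which to start drawing, and the editor region affected would be a natural shape. All field names here are invented for illustration, not from any existing nrepl implementation:

```python
import json

def make_draw_message(code, beat, duration_beats, region):
    """Package an edit plus musical timing info as a JSON message."""
    return json.dumps({
        "op": "draw",                # what the renderer should do
        "code": code,                # the text to be "handwritten"
        "beat": beat,                # musical time at which drawing starts
        "duration": duration_beats,  # how long the drawing animation lasts
        "region": region,            # editor region being (re)drawn
    })

msg = make_draw_message('d1 $ sound "bd sn"', beat=16, duration_beats=4,
                        region={"line": 3, "col": 0})
```

Keeping the timing in musical units (beats) rather than wall-clock seconds would let the renderer stay in sync even if the livecoder changes tempo mid-performance.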
1. I have no evidence for this… but I’d like to try and get some. ↩