
here i track my electronic music performance / NIME progress

final post
this is the end. i'm posting a bunch of things that should've been posted a little while ago and should document where i am.

NIME multi-touch photoset on flickr

multi-touch block diagram (hardware and software)

multi-touch flow

ending thoughts on my manifesto

i like my manifesto. i feel like it's better articulated than anything i've argued in class. after seeing people's projects in class, especially will's (the glorious soundlamp), i'd like to add a clause that i implied but wasn't specific about:

be connected
performances in this "genre" are made up of many discrete pieces: sound/music, video, physical spectacle, overarching concept, etc. to make the most impact, connect these elements as much as possible - make the sound that should be there when the lights go wild. we said this, for the most part, in class, but it's so important that i couldn't let it go.

future of performance?

we were asked, "What is the future of performance?" there is no one future. at the moment, we have more performance power than we know what to do with. the future will be molding this into a multitude of things - audience-involving performances, installations, traditional concerts, etc. this isn't a radical change, either; the history of performance has been a cycle of technological development and refinement into art, in the various spaces of performance. we're lucky to be at the beginning of a new technological cycle.

class 2
my project, concisely
i am doing research in multi-touch interaction. my first step is to build a multi-touch screen based on FTIR (frustrated total internal reflection), something that has been done by many people already and has already made appearances at NIME. this will be finished soon, and i have already written one musical application with it in mind (a ball-bouncing 3D environment, created for installation in the NY Hall of Science museum, intended to (a) be fun and make great sound, and (b) teach kids a little bit about physics). because of the physical nature of the screen, it lends itself more to an installation scenario than a performance one. i am embracing that, and want to make applications that are simple and immediately accessible to non-musicians.
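to give a flavor of the kind of physics the installation plays with, here's a minimal sketch (not the actual Hall of Science app, and the pitch mapping is entirely my invention for illustration): a ball falling under gravity, bouncing with some energy loss, where each floor hit could trigger a sound whose pitch follows the impact speed.

```python
# minimal sketch of a bouncing-ball sound trigger. all constants and
# the impact->pitch mapping are hypothetical, for illustration only.

GRAVITY = -9.8       # m/s^2
RESTITUTION = 0.8    # fraction of speed kept per bounce
DT = 0.001           # simulation step, seconds

def simulate_bounces(y=2.0, vy=0.0, duration=5.0):
    """step a 1D ball and return the impact speed at each floor hit."""
    impacts = []
    t = 0.0
    while t < duration:
        vy += GRAVITY * DT
        y += vy * DT
        if y <= 0.0 and vy < 0.0:      # hit the floor
            impacts.append(-vy)        # record impact speed
            vy = -vy * RESTITUTION     # bounce back with energy loss
            y = 0.0
        t += DT
    return impacts

def impact_to_freq(speed, base=220.0, scale=60.0):
    """hypothetical mapping: harder hits -> higher pitch."""
    return base + scale * speed

impacts = simulate_bounces()
freqs = [impact_to_freq(s) for s in impacts]
```

each bounce loses energy, so the triggered pitches drift downward as the ball settles - which is exactly the sort of audible physics lesson the installation is after.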
i also plan to take new directions with the technology. some possibilities:
yesterday, a co-conspirator and i attempted to build a frame for the acrylic. long story short: the frame (made of wood) cracked as we were building it because we used screws that were too big; the wood that we carefully mounted 17 IR LEDs in is a bit warped (which wouldn't be an issue had the frame not cracked); and we scratched the nice $100 piece of acrylic. dammit.
on the positive side, the wiring for the LEDs went well, and we figure we can cut the scratched part of the acrylic off and have a smaller screen (and buy another $100 damn piece of plexi).
we have a renewed interest in using aluminum to build the frame. anybody have a spare drafting table we can hack apart?


the ever-popular jeff han video:

more to come...

class 1
in class
we went over what the class is going to be. since it's so short (only five classes left now!) we're not going to be doing much technical stuff in class, it seems; more discussion and examining other people's work.

my least-favorite performance
this isn't my least favorite performance, but it sticks out in my mind as a particularly bad one. i saw luke dubois perform a few years ago (probably winter '05?) at the tank's space on 27th st. (still there? they're always moving around). i had little introduction to what he was doing, so i'll present my experience as i understood it at the time.

i walked in, and people were mostly seated. it was quiet... not as quiet as, say, the stone, but still quiet - definitely gallery-style rather than club-style. my friend and i took a seat behind a pillar, from which we could somehow still see where luke was going to be sitting and where the projector was idling on the wall. he simply had a laptop and projector (perhaps a wacom tablet too..? it wasn't memorable if he did use it). we timed our arrival well, because he walked up onto stage (sans fanfare) a few minutes after we sat down.

in his inimitable style, luke awkwardly nodded at the crowd a bit, and started playing. the projector that had been black before now showed a purple blobby shape, and the room filled with a very buzzy, glitchy sound. it didn't change much in pitch, as far as i remember - there was little use of melody. it was very harmonically rich, and changing all the time. after a minute it became clear that the sound and video were closely coupled (luke didn't say anything about how it was done). every ~8 minutes, a "song change" (ha) would happen - the shape would drastically change color and shape, the sound would drop out for a second, and come back sounding "the same but different," as in a different fundamental, and perhaps some other small changes, none of which were memorable.

this went on for about 50 minutes, until it seemed like another song change was about to happen - but luke closed the lid of his laptop instead. people clapped, he walked off, and an MC said something to the effect of "yay luke! somebody else is next." i was thoroughly bored, and had something else to attend to, so i left.

notice how distant and uncertain this recollection sounds? that is exactly how it feels in my memory. i was bored, uncertain of what was going on, and even though there was technically plenty of stimulation (video and audio) there wasn't much on which to focus my attention.

i suppose it's time for a disclaimer: i love laptop music. i do it myself (my duo freedom and responsibility plays fairly often - just 2 guys with laptops making sound, and projection once in a while. no alternative controllers, no fireworks, no gimmicks... our max patches are SUPER-ugly too!). i also love simple droney music, and often set up my synths to do a complex drone that i listen to for hours without changing.

that said, luke's performance was lacking in so many ways. i've since learned that he creates NURBS surfaces with a mix of direct control and controlled randomness, then scans across the surfaces and uses them to directly generate the audio. in fact, his reasoning (i learned all this from a guest lecture he gave and from a few random conversations) for the concept is that he's sick of the disparity between the physical shape of the laptop and the sound being produced. cellos, on the other hand, have their shapes dictated by necessity, and their shape has a direct effect on the sound. this video work is a response to that.
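a toy version of the scanned-surface idea, as i understand it: read heights off a 2D surface along a scan path and use them directly as audio samples, so the shape literally is the waveform. to keep this self-contained i've swapped a real NURBS surface for a simple analytic height field, and the whole mapping is my guess at the concept, not luke's actual patch.

```python
# toy sketch: surface-scanning synthesis. the "surface" here is a
# simple sinusoidal height field standing in for actual NURBS, and
# the scan scheme is hypothetical - illustration only.
import math

def surface_height(u, v):
    """stand-in height field over the unit square (not real NURBS)."""
    return math.sin(2 * math.pi * 3 * u) * math.cos(2 * math.pi * 2 * v)

def scan_to_samples(n_samples=44100, v=0.25, cycles=440):
    """sweep u across the surface at row v, wrapping 'cycles' times
    per n_samples, so the surface profile becomes a periodic waveform
    (here, roughly a 440 hz tone over one second at 44.1 khz)."""
    samples = []
    for i in range(n_samples):
        u = (i * cycles / n_samples) % 1.0   # wrapped scan position
        samples.append(surface_height(u, v))
    return samples

audio = scan_to_samples()
```

reshaping the surface would reshape the waveform directly - which is presumably why the video and audio in the performance felt so tightly coupled.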

the problem, though, is that he didn't tell the audience that! i was wondering what he was doing the whole time - was he changing numbers on a synthesis algorithm, and running a jitter patch that visualized the sound? or the other way around? or a combination? or were both the video and audio controlled by him?

because i didn't know these things, and the sound offered so little to focus on, i just got angry as the performance went on. it's something i often worry about when thinking about my own performances with freedom and responsibility. do we make sense? are we boring the hell out of people? should we talk about what we do (we don't)? in the end i don't worry, because we don't base our music on a concept; we just improvise and do whatever sounds great. i think a lot of artists in this vein could stand to lose some BS baggage and just make great sounds.

my manifesto
(this may reiterate some points i made above; sorry, i think they're very important)
that's it. very simple. these (honestly) are the three things i keep in mind while programming max patches, recording samples, rehearsing, and playing on stage. i don't worry about presentation (although i "might could"); i let the sound stand on its own.
(and boy, do i sound conservative. weird)

Synthesis and Control on Large Scale Multi-Touch Sensing Displays
by phil davidson and jeff han pdf

the most important thing this paper touches upon is haptics. touch screens are amazingly versatile input devices, but they sorely lack physical feedback. "tables" approach a solution by requiring interaction with pucks and blocks on top of the touch screen surface, but they lack tension - imagine an interface imitating a set of knobs: a puck offers none of the physical resistance a knob does.
the paper, though, seems to decide that virtual pucks can compare to physical ones on a table. while, of course, on-screen pucks can be infinitely more versatile than a physical puck (they can zoom, slide around algorithmically, reshape, etc.), they don't serve the same purpose as a physical item.

one thing i found interesting is their performance claims: 50hz with 2mm accuracy on a 36x27 inch screen, which really means they can capture and process about 50 frames per second at 640x480. my software has almost the same performance in its alpha stage (read: little optimization). i feel good.
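a quick back-of-envelope check of those numbers (the unit conversion is just arithmetic; the blob-centroid interpretation at the end is my own reading):

```python
# sanity-checking the paper's claims: 50 hz capture, 2 mm accuracy,
# a 36x27 inch surface, imaged at 640x480.

width_mm = 36 * 25.4            # 914.4 mm across
mm_per_pixel = width_mm / 640   # ~1.43 mm per camera pixel

# so 2 mm accuracy is about 1.4 camera pixels - roughly what you'd
# expect from blob-centroid tracking without subpixel refinement.
accuracy_px = 2.0 / mm_per_pixel

# throughput needed to hit 50 fps at this resolution
pixels_per_second = 640 * 480 * 50   # 15.36 million pixels/sec
```

in other words, the claimed accuracy is right around single-pixel resolution, so matching their numbers mostly comes down to keeping per-frame processing under 20 ms.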

Mary had a little scoreTable* or the reacTable* goes melodic
by sergi jorda and marcos alonso pdf

this paper's most interesting discussion isn't really about touch screens or tables, but about the use of pitch in new electronic instruments. the year before, at nime, perry cook "stated that when building a new controller, one of the first things he would try to do is play the simplest song such as 'Mary had a little lamb.'" it makes one wonder - why should an instrument need to be able to do this?

my project
i'm doing my second prototype multi-touch screen. the first used the IR-bounce method, which is quite low in resolution. this version uses FTIR (frustrated total internal reflection), which is absolutely amazing technology. (version 3 will be this FTIR display built into a nice case, hopefully mobile, and multi-mode. more later)
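for anyone curious what the software side of an FTIR screen boils down to: a finger pressing the acrylic frustrates the internal reflection and shows up as a bright blob in the IR camera image, so touch tracking reduces to thresholding each frame and finding blob centroids. here's a hedged sketch of that core step over a toy frame - a real implementation would also do background subtraction and run per-frame on live camera input, and the threshold value is made up.

```python
# hedged sketch of FTIR touch detection: threshold the IR frame, then
# flood-fill connected bright regions and report their centroids.

THRESHOLD = 128  # hypothetical brightness cutoff for "touched" pixels

def find_touches(frame):
    """return (row, col) centroids of bright connected regions."""
    rows, cols = len(frame), len(frame[0])
    seen = [[False] * cols for _ in range(rows)]
    touches = []
    for r in range(rows):
        for c in range(cols):
            if frame[r][c] >= THRESHOLD and not seen[r][c]:
                # flood-fill one blob, accumulating its pixels
                stack, blob = [(r, c)], []
                seen[r][c] = True
                while stack:
                    y, x = stack.pop()
                    blob.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and frame[ny][nx] >= THRESHOLD
                                and not seen[ny][nx]):
                            seen[ny][nx] = True
                            stack.append((ny, nx))
                cy = sum(p[0] for p in blob) / len(blob)
                cx = sum(p[1] for p in blob) / len(blob)
                touches.append((cy, cx))
    return touches

# two "fingers" on a tiny 6x8 test frame
frame = [[0] * 8 for _ in range(6)]
for y, x in [(1, 1), (1, 2), (2, 1), (2, 2), (4, 5), (4, 6)]:
    frame[y][x] = 200
print(find_touches(frame))  # two centroids, one per blob
```

tracking touches over time (matching centroids between frames) and calibrating camera coordinates to screen coordinates sit on top of this, but the blob step is the heart of it.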

what i've been working on
i've gotten a lot done over the weekend, hardware- and software-wise.