A few things happened this week. Jack Schulze and Matt Jones of BERG stopped by the SVA studio to lead some students in a weeklong workshop. It was amazing. We made nonsensical product drawings, re-imagined the average household thermostat as something you might see in a book of kooky Japanese inventions by making paper prototypes, and did a bit of technological material research to feed our craft. All of it was well worth losing a bit of time I'd otherwise spend focused on thesis.
Nonetheless, I still managed to hit a major milestone in my thesis prototype. I was able to calculate the 3-dimensional velocity of a user’s right hand (chosen somewhat arbitrarily… I’ll eventually be performing these calculations on all the virtual joints). I wired that velocity directly to the note velocity parameter of my Max/MSP patch, et voilà!
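For the curious, here's a minimal sketch of the kind of calculation involved: differencing a joint's position between two frames to get a velocity vector, then clamp-scaling its magnitude into the MIDI note velocity range. The function names and the 2 m/s scaling ceiling are my own illustration, not what's actually in my patch.

```python
import math

def joint_velocity(p_prev, p_curr, dt):
    """3D velocity vector of a tracked joint between two frames.

    p_prev, p_curr: (x, y, z) positions in meters; dt: frame time in seconds.
    """
    return tuple((c - p) / dt for p, c in zip(p_prev, p_curr))

def speed(v):
    """Magnitude of a velocity vector, in m/s."""
    return math.sqrt(sum(c * c for c in v))

def to_midi_velocity(spd, max_speed=2.0):
    """Clamp-scale a speed into the MIDI note velocity range 1-127.

    max_speed is an arbitrary ceiling: any hand motion at or above
    2 m/s maps to the loudest possible note.
    """
    return max(1, min(127, int(spd / max_speed * 127)))
```

So a hand that travels 10 cm in one 30 fps frame is moving at 3 m/s, which pins the note velocity at 127; a hand at rest still produces a just-audible velocity of 1 rather than a silent note.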
First Gestural Mappings: Velocity
Also, after speaking with the fine gentlemen of BERG about my thesis, they gave me a lot of great feedback and encouraged me to consider how I could get this idea across in a video without having to explain it. I.e., how could I do something like this:
Now, admittedly, that’s a simpler device than what I’m constructing. But you get it almost in the first instant. You get why it’s great from the smile on her face and the way she starts adding her own performative gestures, which really make this gadget sing (forgive me). I don’t need to know how it works. I just see that it does.
How can I do that?