Physical Prototype
Materials: bendable wires and nylon threads.
Music
Idle state
A song will play when no one is interacting with the installation.
This is my draft.

This is the workflow in Ableton showcasing the idle state.

The song would be like this:
Active state
One User
When there is only one user, they trigger the change of tracks within each of the four elements. If they are in the percussion section, idlePerc1 is switched to activePerc1, and so on, based on the order in which they walk through the sections. For instance, if the path is Percussion –> Verse –> Chorus –> Improv, then every element switches to active mode and each MIDI track plays immediately, so the user gets feedback right away. A sketch of this switching logic follows below.
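Here is a minimal Python sketch of that switching logic. The section names and the idle/active clip names mirror the ones above; how the switch actually reaches Ableton (e.g., as OSC messages to a Max for Live device) is left as a stub.

```python
# Minimal sketch of the single-user idle->active switching logic.
# Clip names mirror the idlePerc1/activePerc1 naming above; the other
# pairs are illustrative. Sending the switch to Ableton is stubbed out.

CLIP_PAIRS = {
    "percussion": ("idlePerc1", "activePerc1"),
    "verse":      ("idleVerse1", "activeVerse1"),
    "chorus":     ("idleChorus1", "activeChorus1"),
    "improv":     ("idleImprov1", "activeImprov1"),
}

active = set()  # sections already switched to active

def on_user_enters(section):
    """Switch a section from idle to active the first time the user walks in."""
    if section in active:
        return
    idle_clip, active_clip = CLIP_PAIRS[section]
    stop_clip(idle_clip)    # stub: however the idle clip gets stopped
    play_clip(active_clip)  # stub: play immediately for instant feedback
    active.add(section)

def stop_clip(name): print(f"stop {name}")
def play_clip(name): print(f"play {name}")

# Example path: Percussion -> Verse -> Chorus -> Improv
for section in ["percussion", "verse", "chorus", "improv"]:
    on_user_enters(section)
```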

Following the example flow, the song would sound like this:
Below is the single-user workflow in my Ableton set.

Multiple Users
If there are multiple users (say four, for the sake of explanation), assume each person stands within one individual square of musical elements, i.e., percussion, verse, chorus, or improv. Each square is mapped so that its center point is (0,0), which holds the original tempo with no panning. When a user moves up and down, as the picture shows, the tempo increases and decreases; when the user moves left and right, the panning changes.
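A minimal sketch of that mapping, assuming each user's position is normalized to [-1, 1] within their square with (0, 0) at the center; the base tempo and tempo range are made-up values:

```python
# Per-square mapping sketch: y (up/down) scales tempo, x (left/right)
# sets panning, and the center (0, 0) keeps the original tempo unpanned.
# BASE_TEMPO and TEMPO_RANGE are assumptions for illustration.

BASE_TEMPO = 120.0   # original tempo at the center (assumed)
TEMPO_RANGE = 40.0   # +/- BPM at the top/bottom edge (assumed)

def map_position(x, y):
    """Map a normalized (x, y) in [-1, 1] to (tempo, pan), pan -1 = L, 1 = R."""
    tempo = BASE_TEMPO + y * TEMPO_RANGE
    pan = max(-1.0, min(1.0, x))
    return tempo, pan

print(map_position(0.0, 0.0))   # (120.0, 0.0): original tempo, centered
print(map_position(-0.5, 1.0))  # (160.0, -0.5): faster, panned left
```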
An example song will be added here soon.
Hardware
What I tried
- Sending OSC data to Max through a package called KinectV2-OSC
- I tried to follow the instructions on my Windows 10 machine, but it didn't work. The GitHub repo was written six years ago, so I wouldn't recommend using it.
- Synapse – “Kinect to control Ableton Live, Quartz Composer, Max/MSP/Jitter, and others that receive OSC events”
- It did not work with the Mac 10.15 system, but it does connect with Ableton Live, so you can use the Kinect directly to control some sound elements. If you haven't updated your Mac and you want to try Synapse, I highly recommend it. If you succeed with an older version, please let me know!
- Microsoft Kinect SDK on Windows
- It is quite intuitive, but I didn't know what to do until this video. I can get Kinect data straight from the software, which is exciting, but I was still struggling with sending the data from Windows to my Mac. I asked my friends, and they mentioned that as long as both machines are on the same network, I should be fine; however, I still couldn't figure out why it wasn't working. Then I went to the coding lab, and Billy was very helpful! (See the OSC-forwarding sketch after this list.)
- A tutorial to follow for installing a Kinect is here on Instructables
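For the Windows-to-Mac step, here is a minimal sketch of forwarding joint data over the local network as OSC, using the python-osc package (pip install python-osc). The Mac's IP, the port, and the OSC address scheme are all assumptions; on the Mac, Max can receive these with a [udpreceive] object on the same port.

```python
# Sketch: send Kinect joint data from the Windows machine to the Mac
# over the LAN as OSC messages, using python-osc.

from pythonosc.udp_client import SimpleUDPClient

MAC_IP = "192.168.1.42"  # replace with the Mac's address on the same network
PORT = 9000              # must match the port Max listens on ([udpreceive 9000])

client = SimpleUDPClient(MAC_IP, PORT)

def send_joint(name, x, y, z):
    """Forward one joint position as /joint/<name> x y z (address scheme assumed)."""
    client.send_message(f"/joint/{name}", [x, y, z])

send_joint("right_hand", 0.12, 1.05, 2.3)
```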
What works
Ni-Mate – worked perfectly before my Kinect died (TAT) due to an unstable cable connection.
- Delicode NI mate has dummy data, so even though I don't have the Kinect that I ordered yet, I can still play with it.

- Ni-Mate Max patch to get the Kinect data
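The same data can also be read outside Max for quick testing. Here is a minimal python-osc sketch, assuming Ni-Mate's basic OSC mode (each joint sent as its own address with x/y/z floats) and its usual default port of 7000; both are worth double-checking in Ni-Mate's preferences.

```python
# Sketch: print every joint message Ni-Mate sends over OSC.
# Assumes basic mode (address = joint name, args = x, y, z) on port 7000.

from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import BlockingOSCUDPServer

def on_joint(address, *args):
    # e.g. address = "/Right_Hand", args = (x, y, z)
    print(address, args)

dispatcher = Dispatcher()
dispatcher.set_default_handler(on_joint)  # catch every joint address

server = BlockingOSCUDPServer(("127.0.0.1", 7000), dispatcher)
server.serve_forever()
```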

Below is an example of how to work around computing the velocity. We tried to use Body/V in the Kinect data; however, it didn't show up in the Max Console.
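Since Body/V never appeared, one workaround is to approximate velocity by differencing successive joint positions over time. A minimal sketch (names and structure are illustrative):

```python
# Approximate joint velocity as distance moved divided by elapsed time
# between successive position samples.

import math
import time

class VelocityTracker:
    def __init__(self):
        self.last_pos = None
        self.last_t = None

    def update(self, x, y, z):
        """Return speed (units/sec) since the previous sample, or 0.0 at first."""
        now = time.time()
        speed = 0.0
        if self.last_pos is not None:
            dt = now - self.last_t
            if dt > 0:
                dx = x - self.last_pos[0]
                dy = y - self.last_pos[1]
                dz = z - self.last_pos[2]
                speed = math.sqrt(dx*dx + dy*dy + dz*dz) / dt
        self.last_pos = (x, y, z)
        self.last_t = now
        return speed

tracker = VelocityTracker()
print(tracker.update(0.0, 1.0, 2.0))  # 0.0 on the first sample
time.sleep(0.1)
print(tracker.update(0.1, 1.0, 2.0))  # ~1.0 if 0.1 units moved in ~0.1 s
```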

Next Steps
I will learn Max for Live and have Max control the tracks in Ableton through Max for Live.

Billy mentioned a great resource for learning Max on Kadenze. He also recommended that I learn the Live Object Model in Max for Live.
Since what I want is to have the sensor data change some aspects of the Ableton MIDI tracks, such as tempo and arrangement (play on/off), below is my general workflow:
Kinect sensor data -> Max/MSP -> Max for Live -> determine which track to tweak through the transport
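To make the flow concrete, here is a minimal sketch of the last hop, assuming a hypothetical Max for Live device listening for OSC on port 9001 that maps these made-up addresses onto the Live Object Model (/tempo to the song tempo, /track/<n>/play to a clip's play state):

```python
# Sketch of driving Ableton through a (hypothetical) Max for Live device
# that listens for OSC and forwards values into the Live Object Model.

from pythonosc.udp_client import SimpleUDPClient

live = SimpleUDPClient("127.0.0.1", 9001)  # Max for Live device on this machine

def set_tempo(bpm):
    live.send_message("/tempo", float(bpm))

def set_track_playing(track_index, playing):
    live.send_message(f"/track/{track_index}/play", 1 if playing else 0)

# e.g., a user entering the percussion square speeds things up
set_tempo(132.0)
set_track_playing(0, True)
```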
Correct me if you think the flow needs to be improved, thanks!