
Nature and us – Process Update

Process

  1. Finish the web prototype – DONE
  2. Work on Max/MSP for the prototype – after the final
  3. Connect with the Kinect through the web – DONE
  4. Connect with the Kinect through Max/MSP – after the final

This post lays out the process, problems, and solutions I encountered during the project. I decided not to use Max/MSP for the final and to replace the Kinect with a webcam, so that users can interact with the piece even when they are at home.

I fixed problems in the Tone.js code and also fixed the Fetch API calls and the cors-anywhere Heroku app as of 11/28/2020. This week I will work on switching from the sun API to the ISS API, to make sure the longitude and latitude change frequently enough to select different field recordings.

Problems

Tried to use merge to create a multichannel output from two clips.

Can I use .connect() to connect multiple nodes?

The distortion below does not work.

This also does not work.

toDestination() does not work. I replaced it with toMaster(), but I am not sure that is the right way to do it. (toDestination() only exists from Tone.js v14 on; earlier versions use toMaster() and Tone.Master, so which one works depends on the version loaded.)

How do I figure out the length of the audio clip?

I moved the Tone variables from preload to setup, and I update them in draw.

GrainPlayer is not working either.

Elements

sun_azimuth – the horizontal angle of the sun's position, measured in degrees.

According to Wikipedia, the sun's position was almost due south at 10 am on 11/1/2020.

So I want to use the sun-angle data to determine the fade-in and fade-out times of the audio clip: angle/360 gives the fraction used for the clip's fade-in and fade-out.
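The mapping above can be sketched as a small helper. This is only a sketch under my own assumptions: `fadeTimeFromAzimuth` is a hypothetical name, the clip URL is a placeholder, and it assumes a pre-v14 Tone.js where `toMaster()` exists.

```javascript
// Hypothetical helper: map the sun's azimuth (0–360°) to a fade time,
// using the fraction azimuth / 360 of the clip length, as described above.
function fadeTimeFromAzimuth(azimuthDeg, clipLengthSec) {
  const fraction = (azimuthDeg % 360) / 360; // normalize to 0–1
  return fraction * clipLengthSec;
}

// Wiring it into a Tone.Player (browser only; skipped outside the browser):
if (typeof Tone !== 'undefined') {
  const player = new Tone.Player('recording.mp3').toMaster(); // placeholder URL
  const fade = fadeTimeFromAzimuth(180, 30); // e.g. sun due south, 30 s clip
  player.fadeIn = fade;   // seconds
  player.fadeOut = fade;
}
```

With the sun due south (180°), half of a 30-second clip would be spent fading.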

Installing the Kinect on my Windows machine

Instructions on Instructables

This is me testing my Kinect in Kinect Studio.

Working with the Kinect

It didn't work for the Kinect2 – an issue with Python.

Cloned kinect-js; currently having issues with importing and exporting.

Converted it to a JSON file – audio link – I should experiment with printing out the data.

Questions for office hour

  1. How do I get the audio length?
  2. Can you have two players playing at the same time, one in the left and one in the right channel, and merge them? (crossfade)
  3. Can you do ping-pong for clips?
  4. How do I create a sampler based on the clip I get from the player? – line 112
  5. Why does toDestination() not work?
  6. How do I get the Kinect data in the JS?

After office hour

  1. The order of the effects matters in the player: chain(effect1, effect2, Tone.Master);
  2. I can play two or more players; I just have to connect them to the master and set autostart to true if I want them to start.
  3. Try Kinectron instead to send the Kinect data to JS – it can also send the data to multiple computers.
  4. The sampler in Tone.js is meant to mimic keyboards.
  5. To create clips from sounds, you can just use a player.
  6. Channels can wait for further development – for panning.
  7. Control the fade-in/fade-out times to fade into the second clip.
  8. To make melodic music from given clips, I can use a bandpass filter / comb filter with resonance, OR transpose the clip and then put it in a sampler.
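Points 1, 2, and 7 of the notes above could be sketched roughly like this, assuming a pre-v14 Tone.js where `Tone.Master` and `toMaster()` exist; the effect settings, clip URLs, and the `crossfadeGains` helper are my own illustrative additions, not the project's actual values.

```javascript
// An equal-power crossfade curve: at t = 0 only clip A is heard, at t = 1 only
// clip B. A pure helper, so the curve can be checked outside the audio graph.
function crossfadeGains(t) {
  return {
    a: Math.cos(t * Math.PI / 2),
    b: Math.sin(t * Math.PI / 2),
  };
}

if (typeof Tone !== 'undefined') {
  // Point 1: the order of effects in chain() matters; signal flows left to right.
  const playerA = new Tone.Player('fieldRecordingA.mp3'); // placeholder URL
  playerA.chain(new Tone.Distortion(0.4), new Tone.Freeverb(), Tone.Master);

  // Point 2: a second player can run at the same time, as long as it also
  // reaches the master and autostart is set.
  const playerB = new Tone.Player('fieldRecordingB.mp3').toMaster();
  playerA.autostart = true;
  playerB.autostart = true;

  // Point 7: fade times control how one clip gives way to the next.
  playerA.fadeOut = 5; // seconds
  playerB.fadeIn = 5;
}
```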

Using loadJSON, it works.

Kinectron is working.

Drawing the skeleton using Kinectron and p5

Struggling to show the two canvases together with separate JS files.

Continuing to work on getting the Kinect data and mapping it.

Getting the joint data

To figure out the index of each joint, I searched the official documentation.

kinectron.SPINEBASE = 0;
kinectron.SPINEMID = 1;
kinectron.NECK = 2;
kinectron.HEAD = 3;
kinectron.SHOULDERLEFT = 4;
kinectron.ELBOWLEFT = 5;
kinectron.WRISTLEFT = 6;
kinectron.HANDLEFT = 7;
kinectron.SHOULDERRIGHT = 8;
kinectron.ELBOWRIGHT = 9;
kinectron.WRISTRIGHT = 10;
kinectron.HANDRIGHT = 11;
kinectron.HIPLEFT = 12;
kinectron.KNEELEFT = 13;
kinectron.ANKLELEFT = 14;
kinectron.FOOTLEFT = 15;
kinectron.HIPRIGHT = 16;
kinectron.KNEERIGHT = 17;
kinectron.ANKLERIGHT = 18;
kinectron.FOOTRIGHT = 19;
kinectron.SPINESHOULDER = 20;
kinectron.HANDTIPLEFT  = 21;
kinectron.THUMBLEFT = 22;
kinectron.HANDTIPRIGHT = 23;
kinectron.THUMBRIGHT = 24;
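Assuming Kinectron's tracked-body joints carry normalized `depthX`/`depthY` coordinates in the 0–1 range, a joint from the table above can be mapped onto the p5 canvas. This is a hedged sketch: `jointToPixel` is a hypothetical helper, and the server IP is a placeholder.

```javascript
// Map a Kinectron joint's normalized depth coordinates (0–1) to canvas pixels.
function jointToPixel(joint, canvasWidth, canvasHeight) {
  return {
    x: joint.depthX * canvasWidth,
    y: joint.depthY * canvasHeight,
  };
}

const HEAD = 3; // joint index from the table above

// p5 + Kinectron wiring (browser only): draw a circle on the tracked head.
function setup() {
  createCanvas(640, 480);
  const kinectron = new Kinectron('127.0.0.1'); // placeholder Kinectron server IP
  kinectron.makeConnection();
  kinectron.startTrackedBodies(drawBody);
}

function drawBody(body) {
  const head = jointToPixel(body.joints[HEAD], width, height);
  ellipse(head.x, head.y, 20, 20);
}
```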

For a while my Chrome didn't play sounds anymore and showed an error.

I fixed it with a snippet that I called in setup. Then I deleted the snippet, since it threw `getAudioContext is not defined` (I had not added the p5.sound library), and everything went back to normal.

Reference link

Spent roughly an hour on the tone.js sounds. Here is a short sample of the sound.

Getting the API from Aporee Radio

Something related to the CORS policy.

Then I discovered something that I am trying right now.

I tried to use the cors-anywhere Heroku app and realized that there were limits on the requests.

Then I searched on Google and found this link describing the issue.

Then I followed this GitHub repo to create my own proxy.

Steps:

  1. Fork the repo.
  2. Connect it with Heroku.
  3. Deploy on Heroku.
  4. Replace the line of code in this project's repo.
  5. It works – when I send more than 10 requests within one minute, it no longer shows the Too Many Requests error.
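The steps above could be wired up roughly like this; `withProxy`, the proxy base URL, and the recording URL are all placeholders of my own, not the actual project values.

```javascript
// Prefix a recording URL with the self-hosted cors-anywhere proxy, the way
// cors-anywhere expects: proxyBase followed directly by the full target URL.
function withProxy(url, proxyBase) {
  return proxyBase.endsWith('/') ? proxyBase + url : proxyBase + '/' + url;
}

// Browser only: fetch the recording through the proxy.
if (typeof window !== 'undefined') {
  const PROXY = 'https://my-cors-proxy.herokuapp.com/'; // placeholder app URL
  fetch(withProxy('https://example.com/recording.mp3', PROXY))
    .then((res) => res.text())
    .then((data) => console.log(data))
    .catch((err) => console.error(err));
}
```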

UPDATES after 12/1/2020

Discussion with Luke:

Not sure why the following does not play:

const player = new Tone.Player("https://tonejs.github.io/audio/drum-samples/breakbeat.mp3").toDestination();
player.loop = true;
player.autostart = true;

function setup() {
  createCanvas(windowWidth, windowHeight);
}

function draw() {
}

Emailed Luisa for an office hour, and scheduled a meeting with Schuyler tomorrow at 1 pm to figure out how to save the files to the server.

Finally got the audio to play

Called the recording link in draw and concatenated it with the proxy address.

Created the Tone player in setup but keep updating its URL.
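One way to sketch that pattern: create the player once in setup and reload it only when the URL actually changes, so draw doesn't restart the clip every frame. `shouldReload` and `pickRecordingUrl` are hypothetical helpers, and this assumes a pre-v14 Tone.js where `toMaster()` and `player.load()` exist.

```javascript
// Only reload the player when the recording URL actually changes.
function shouldReload(currentUrl, newUrl) {
  return Boolean(newUrl) && newUrl !== currentUrl;
}

// Placeholder: in the real sketch this would pick a clip from the API data.
function pickRecordingUrl() {
  return null;
}

let player;
let currentUrl = null;

function setup() {
  createCanvas(windowWidth, windowHeight);
  player = new Tone.Player().toMaster(); // created once, loaded later
  player.loop = true;
}

function draw() {
  const newUrl = pickRecordingUrl();
  if (shouldReload(currentUrl, newUrl)) {
    currentUrl = newUrl;
    player.load(newUrl).then(() => player.start());
  }
}
```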

struggled with

It is working properly. I will switch from the Kinect to PoseNet to make it more accessible.


Published by Yiting Liu

NYU ITP '21
