Thursday, 31 May 2012

Reflective Report


'Kinect : You : Music' was a success, and the process I took to get there was simple. By discovering the minim library, the Xbox Kinect and their integration possibilities, I was able to effectively augment space visually and sonically. Although it had some syncing troubles, it still provided an interesting interactive product.

I experimented with the effects, looping and AudioPlayer properties of the minim library. I discovered that the band-pass filter was the most appropriate effect for the project. I also realised that the best way to loop the audio was to detect when each track reached its end, then reload and replay it. This resolved all the syncing issues with looping and helped the program run more smoothly.
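The loop fix above boils down to a small check each frame: when playback is near the end of a track, reload and replay it rather than relying on the library's built-in looping. A minimal sketch of that check in plain Java (the millisecond values and the margin are illustrative; in minim, `AudioPlayer.position()` and `AudioPlayer.length()` report these times):

```java
public class LoopRestart {
    // Returns true when playback is within marginMs of the end of the track,
    // i.e. when it is time to reload and replay it.
    static boolean shouldRestart(int positionMs, int lengthMs, int marginMs) {
        return positionMs >= lengthMs - marginMs;
    }

    public static void main(String[] args) {
        // A 10 000 ms track, checked 50 ms before the end:
        System.out.println(shouldRestart(9980, 10000, 50)); // true
        System.out.println(shouldRestart(5000, 10000, 50)); // false
    }
}
```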

Then I got an Xbox Kinect and experimented with the way it tracks movement, and found that the scene function was the most effective for the project. It initialised the current scene and recorded it all as black; anything that then moved between the sensor and the background became coloured. The change in colour was what could be measured, and it was the key to providing interaction in the project.
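The measurement described above can be sketched as a simple pixel count: the initialised scene is black, so the fraction of non-black pixels approximates how much movement is in frame. A minimal version in plain Java (packed 24-bit RGB pixels are an assumption, matching Processing's `pixels[]` format):

```java
public class SceneDensity {
    // Fraction of pixels that are not black, treating black (0x000000 in the
    // low 24 bits) as the initialised background and anything else as movement.
    static double density(int[] pixels) {
        int moving = 0;
        for (int p : pixels) {
            if ((p & 0xFFFFFF) != 0) moving++;
        }
        return (double) moving / pixels.length;
    }

    public static void main(String[] args) {
        int[] frame = {0x000000, 0x000000, 0xFF0000, 0x00FF00};
        System.out.println(density(frame)); // 0.5
    }
}
```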

Once the audio ran smoothly and the Xbox was able to measure the density of people in the spaces, I worked on a little visual display and the audio tracks. The visual display was three rectangles, each with the frequency wave of an individual track running through its middle. On site the rectangles were resized to fit the metal panels on the wall, and the waveforms were moved to align with the grooves of the panels. These visualisations were essential to springing the trap of interaction: once the rectangles flashed on the walls, people realised that they were in an interactive musical zone where they could control not only the rectangles but also the music that augmented the space they were in.
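One way to draw each track's waveform through the middle of its rectangle is to map the sample index to a horizontal position and the amplitude to a vertical offset from the centre line. This is a hypothetical sketch of that mapping in plain Java, not the project's actual drawing code:

```java
public class WaveformLayout {
    // Maps sample i of n (amplitude in [-1, 1]) to an (x, y) point along the
    // horizontal centre line of a rectangle at (rx, ry) with size (rw, rh).
    static float[] point(int i, int n, float amp,
                         float rx, float ry, float rw, float rh) {
        float x = rx + (rw * i) / (n - 1);       // spread samples across the width
        float y = ry + rh / 2 - amp * (rh / 2);  // offset from the centre line
        return new float[]{x, y};
    }

    public static void main(String[] args) {
        // The middle sample at zero amplitude lands on the rectangle's centre:
        float[] p = point(50, 101, 0f, 0f, 0f, 200f, 100f);
        System.out.println(p[0] + ", " + p[1]); // 100.0, 50.0
    }
}
```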

The strengths of this project are: the music had a dramatic effect on the vibrancy of the space; the massive light display was a lure to the project; the program was straightforward to manipulate and debug; and it stimulates both the visual and audio senses.

The weaknesses of the project are: it has a slight sync issue and needs to run on fast computers; the audio tracks are short and become repetitive; I only programmed three zones and instruments; and the scale of the effects was not drawn out enough, so each effect felt either on or off rather than switching gradually.

I learnt a great deal about minim and using the Xbox Kinect. The two most important things I discovered in this whole process were: music is an amazing tool for livening a space, and interaction has to be sprung like a trap to be effective. This realisation only dawned upon me whilst watching people walk through the project. Sometimes people would stop in surprise, look, then keep walking, but more often than not people would start interacting with the design as if their purpose had always been to play with the project.

Overall I learnt a lot about going through the whole process of concept - program - hardware - present - display. It was arguably too much fun to be an assignment, or perhaps an example of a well-delivered teaching method with hands-on learning. This project was a good stepping stone for interesting possibilities in future Digital Design courses.

Monday, 7 May 2012

Kinect - The Xbox device of crazy happy fun times.


I have finally got myself an Xbox Kinect and suddenly the world is not so dull and my project is looking all the more probable.

To get the Xbox Kinect working on a Windows computer you will need to download and install either the SimpleOpenNI or the Windows SDK drivers. Daniel Shiffman, who works very closely with Processing, has developed OpenKinect; however, it only runs on Mac. http://www.shiffman.net/p5/kinect/

He recommends SimpleOpenNI for Windows users, so this is the library I downloaded. It is a straightforward process: you go to the website http://code.google.com/p/simple-openni/ , download the bundle of drivers, and install each one IN THE ORDER SPECIFIED. The 32-bit version works very smoothly, whilst the 64-bit version was a pain and was discarded. Once installed, I plugged in the Kinect and started playing with the examples - much fun. SimpleOpenNI can track a skeleton, which requires a calibration pose, whilst tracking movement is done automatically. With depth and scene detection the initial background can be mapped (white is close, black is far away) or it can be left black. In each case, when something moves in the scene it is coloured and tracked.

The Processing sketch created to test for movement was based on the Scene example found in the SimpleOpenNI library. It colours the initial scene black, and then when something moves it is coloured. This is useful as I can test how much of the sketch window is not black and use that to affect the output sound of the sketch (a minim AudioPlayer) http://code.compartmental.net/tools/minim/ . At the moment I am using a band-pass filter, which hollows out the track and pulls the pitch up; as people move into the scene it thickens the sound out. The high-pass and low-pass filters seem to cut out a lot of the song's frequencies, and I do not think they are as effective as the band-pass.
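The interaction described here amounts to mapping the measured movement fraction onto the band-pass filter's centre frequency: an empty scene leaves the sound hollow and high, while a full scene thickens it out. A sketch of that mapping in plain Java (the 400-4000 Hz range is an illustrative assumption, not the project's actual values; in minim the result would be handed to the filter via `BandPass.setFreq()`):

```java
public class FilterMap {
    // Linearly maps a movement fraction in [0, 1] to a band-pass centre
    // frequency: empty scene -> high and thin, full scene -> low and full.
    static float centreFreq(float density, float loHz, float hiHz) {
        float d = Math.max(0f, Math.min(1f, density)); // clamp to [0, 1]
        return hiHz - d * (hiHz - loHz);
    }

    public static void main(String[] args) {
        System.out.println(centreFreq(0f, 400f, 4000f)); // 4000.0
        System.out.println(centreFreq(1f, 400f, 4000f)); // 400.0
    }
}
```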

Lastly, I've been using a program called Reaper http://www.reaper.fm/ , an iPhone app by Native Instruments http://www.native-instruments.com/ called iMaschine, and a bass guitar to create the drum/treble/bass tracks for the project. iMaschine has an amazing library of drum and synth sounds which automatically loop. It is the greatest audio sketchpad for under $10 and has been a pleasure to use during this project. Reaper is the simplest recording software I've ever used; I am still evaluating it at the moment, however for a recreational licence it's around $60 - AWESOME. Reaper allows me to record 30-second sketches, copy them out into 10-minute loops, and then "render" or export each individual track to mp3, which is perfect for importing into and working with the minim library.

From here on in I am focusing on getting the sounds I want and getting them all into the sketch. I will lastly work on a visualisation to go with the audio sketch, probably beat detection combined with the Xbox Kinect data. This is mainly to let the users of the space know that the sound is connected to the exhibition and that they can interact with it.