In the first chapter we created an image that is updated in real time and can be used as a mask. In the second chapter we used it to map a video. Now we use the real-time mask in another real-life example. Our base is a beautiful Processing sketch by Amnon Owed (see here) dealing with the Kinect and physics. The original sketch creates particles that flow like bees. If a human is in sight, the particles flow inside the human silhouette; if not, they run free.
The sketch uses OpenNI to detect human silhouettes, turns them into a polygon via blob detection, and then performs hit detection between the particles and the polygon.
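In rough pseudocode, the original pipeline could look like this (PolygonBlob, createPolygonFromUserPixels and the particle fields are placeholders for illustration, not Owed's actual code):

// Rough shape of the original approach (placeholder names, not Owed's actual code):
// 1. take the user pixels OpenNI marks as "human"
// 2. run blob detection and turn the outline into a polygon
// 3. hit-test every particle against that polygon
PolygonBlob poly = createPolygonFromUserPixels(); // hypothetical helper
for (Particle p : flow) {
  boolean inside = poly.contains(p.x, p.y); // java.awt.Polygon hit test
  // inside == true: let the particle flow here; otherwise steer it back
}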
I used the particle flow in a project with a different approach. The occasion was the CD release concert of a rock band. Among other things, I projected the particle flow onto the people and their instruments, but not onto the background.*
So I did not want the Kinect to track only humans (as in Amnon Owed's sketch) but all kinds of silhouettes (instruments!). That meant I could skip the human detection part as well as the blob-to-polygon conversion: I just used a black/white image like the one we created in the previous chapter.
*In fact, in the real project I also projected a lot of different things onto the background, check here. We will not talk about that here; let's just pretend there is only the particle flow on the people.
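With a mask, the whole polygon machinery collapses into a single pixel lookup. The idea in one line (liveMap is the black/white mask the sketch below rebuilds every frame; p is a particle):

// instead of polygon.contains(x, y): sample the mask at the particle position.
// a white pixel means "inside a silhouette", a black pixel means "background"
boolean insideSilhouette = brightness(liveMap.get(int(p.x), int(p.y))) > 128;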
Code:
import processing.opengl.*;
import SimpleOpenNI.*;

SimpleOpenNI kinect;
int distance = 1500;    // lower depth threshold (mm)
int distance2 = 3000;   // upper depth threshold (mm)
int speed = 4;
int chosenPalette = 0;  // index into the palettes array
color ccolor;
int kinectWidth = 640;
int kinectHeight = 480;
float globalX, globalY; // noise-driven attractor of the flow field
color bgColor;
boolean hitMe = false;
PImage liveMap;         // the real-time black/white mask
color[][] palettes = {
  {
    color(0, 0, 0), color(255, 255, 255), color(200, 200, 210),
    color(150, 150, 170), color(160, 160, 160), color(230, 230, 240),
    color(210, 210, 210), color(190, 190, 220), color(180, 180, 190)
  },
};
Particle[] flow = new Particle[3250];
void setup()
{
  kinect = new SimpleOpenNI(this);
  if (kinect.isInit() == false)
  {
    println("Can't init SimpleOpenNI, maybe the camera is not connected!");
    exit();
    return;
  }
  size(640, 480, OPENGL);
  setupFlowfield();
  liveMap = createImage(640, 480, RGB); // the mask we rebuild every frame
  kinect.setMirror(false);
  kinect.enableDepth();
}
void draw()
{
  background(bgColor);
  kinect.update();
  int[] depthValues = kinect.depthMap();
  // rebuild the mask: white where the depth value lies inside the
  // distance window, black everywhere else
  liveMap.loadPixels();
  for (int y = 0; y < 480; y++) {
    for (int x = 0; x < 640; x++) {
      int i = x + (y * 640);
      int currentDepthValue = depthValues[i];
      if (currentDepthValue > distance && currentDepthValue < distance2) {
        liveMap.pixels[i] = color(255, 255, 255);
      } else {
        liveMap.pixels[i] = color(0, 0, 0);
      }
    }
  }
  liveMap.updatePixels();
  drawFlowfield();
}
void setupFlowfield() {
  strokeWeight(1);
  for (int i = 0; i < flow.length; i++) {
    flow[i] = new Particle(i / 10000.0);
  }
  setColors();
}
void drawFlowfield() {
  // let the attractor wander slowly across the canvas via Perlin noise
  globalX = noise(frameCount * 0.01) * width/2 + width/4;
  globalY = noise(frameCount * 0.005 + 5) * height;
  for (Particle p : flow) {
    p.updateAndDisplay();
  }
}
void setColors() {
  color[] colorPalette = palettes[chosenPalette];
  bgColor = colorPalette[0]; // the first entry is the background color
  for (int i = 0; i < flow.length; i++) {
    flow[i].col = colorPalette[int(random(1, colorPalette.length))];
  }
}
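The listing references a Particle class that is not shown above. Here is a minimal reconstruction modeled on the structure of Amnon Owed's original (the field and method names follow his sketch, but the body is an assumption, not his exact code); the pixel lookup in liveMap replaces his polygon hit test:

// Minimal Particle class; a reconstruction, not Amnon Owed's exact code.
class Particle {
  float id, x, y, xp, yp, s, d; // noise id, position, previous position, speed, direction
  color col;                    // drawing color, assigned in setColors()

  Particle(float id) {
    this.id = id;
    s = random(2, 6); // individual speed
    x = xp = random(kinectWidth);
    y = yp = random(kinectHeight);
  }

  void updateAndDisplay() {
    // steer by Perlin noise; globalX/globalY shift the whole flow field
    id += 0.01;
    d = (noise(id, x / globalY, y / globalY) - 0.5) * globalX;
    x += cos(radians(d)) * s;
    y += sin(radians(d)) * s;

    // wrap around the screen edges
    if (x < -10) x = xp = kinectWidth + 10;
    if (x > kinectWidth + 10) x = xp = -10;
    if (y < -10) y = yp = kinectHeight + 10;
    if (y > kinectHeight + 10) y = yp = -10;

    // the adapted hit test: a black mask pixel means the particle left a
    // silhouette, so try a few random respawn positions inside the mask;
    // if nothing is in the distance window, the particle simply runs free
    if (brightness(liveMap.get(int(x), int(y))) < 128) {
      for (int tries = 0; tries < 50; tries++) {
        float nx = random(kinectWidth);
        float ny = random(kinectHeight);
        if (brightness(liveMap.get(int(nx), int(ny))) > 128) {
          x = xp = nx;
          y = yp = ny;
          break;
        }
      }
    }

    stroke(col);
    line(xp, yp, x, y);
    xp = x;
    yp = y;
  }
}

Particles that drift off a white area get teleported back into one, which is what visually pins the swarm to the people and instruments inside the distance window.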