I have been exploring video and computer vision in Processing, and I find them a very engaging way to build human-computer interactions.
I work and live a few blocks away from Coolture Impact, an interactive public art platform at the Port Authority Terminal. One of the interactive artworks featured recently is Stardust Wishes. It offers visitors a unique experience of this emerging art form. By moving, dancing, waving, or pointing, visitors create their own spectacular light show. Whether shooting holiday fireworks across the massive screen, effortlessly creating swirls of kaleidoscopic colors with a wave of the hand, or swaying an abstract deco cityscape of light, they are essential participants in a unique artistic experience.
Every time I walked by the installation, I slowed down and interacted with the virtual elements on the screen. Even a small movement of a simple image can trigger a lot of fun, so I really wanted to make something just as simple and just as fun.
My idea is very simple: I want to make an interactive program that turns the user, or anything they hold, into a virtual object that interacts with things on the screen.
First of all, I tried motion tracking.
The core idea of motion tracking in Processing is to loop over all the pixels and look for the thing I want to track, whether that's the brightness of a color or the difference between the previous frame's pixels and the current frame's pixels.
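Stripped down, the frame-differencing version looks something like this (a minimal sketch using the Processing video library; the threshold of 50 is just a starting value to tune by eye):

import processing.video.*;

Capture video;
PImage prevFrame;
float threshold = 50;  // how different a pixel must be to count as "motion"

void setup() {
  size(640, 480);
  video = new Capture(this, width, height);
  video.start();
  prevFrame = createImage(width, height, RGB);
}

void captureEvent(Capture video) {
  // Keep a copy of the frame we are about to replace
  prevFrame.copy(video, 0, 0, video.width, video.height, 0, 0, video.width, video.height);
  video.read();
}

void draw() {
  video.loadPixels();
  prevFrame.loadPixels();
  loadPixels();
  for (int i = 0; i < video.pixels.length; i++) {
    color current = video.pixels[i];
    color previous = prevFrame.pixels[i];
    // Distance between the two colors = how much this pixel changed
    float diff = dist(red(current), green(current), blue(current),
                      red(previous), green(previous), blue(previous));
    // Changed pixels show up white, everything else stays black
    pixels[i] = (diff > threshold) ? color(255) : color(0);
  }
  updatePixels();
}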
It turns out like this:
My movement was mapped out as white points, and the colorful ball moved to wherever the pixels were changing.
Then I tried out simple color tracking.
Colors can only be compared in terms of their red, green, and blue components, so it's necessary to separate out these values. To compare two colors, think of each color as a point in three-dimensional space, where instead of (x, y, z) we have (r, g, b), and calculate the distance between the two points with the Pythagorean theorem. If two colors are near each other in this color space, they are similar; if they are far apart, they are different.
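Stripped down, the color tracker looks something like this (a minimal sketch; clicking on the pen in the video picks up the exact color to follow):

import processing.video.*;

Capture video;
color trackColor;  // the color the sketch is currently following

void setup() {
  size(640, 480);
  video = new Capture(this, width, height);
  video.start();
  trackColor = color(255, 0, 0);  // start out looking for pure red
}

void draw() {
  if (video.available()) video.read();
  image(video, 0, 0);
  video.loadPixels();

  float closestDistance = 500;  // larger than any possible RGB distance
  int closestX = 0;
  int closestY = 0;
  for (int x = 0; x < video.width; x++) {
    for (int y = 0; y < video.height; y++) {
      color current = video.pixels[x + y * video.width];
      // Pythagorean distance between the two colors in (r, g, b) space
      float d = dist(red(current), green(current), blue(current),
                     red(trackColor), green(trackColor), blue(trackColor));
      if (d < closestDistance) {
        closestDistance = d;
        closestX = x;
        closestY = y;
      }
    }
  }
  fill(trackColor);
  noStroke();
  ellipse(closestX, closestY, 20, 20);  // mark the best match
}

// Click on the pen in the video to pick up its exact color
void mousePressed() {
  trackColor = video.pixels[mouseX + mouseY * video.width];
}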
OK, now it's time to add a virtual element to interact with. The first thing that came to my mind was a ball: I could make an AR Pong game to pay tribute to my first try at programming. (Pong was one of the very first arcade video games and the first commercially successful one.)
And now it’s time to turn my red pen into a paddle to hit the virtual ball:
Then I used an ArrayList to create multiple virtual paddles:
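The idea, in its simplest form, is an ArrayList of Paddle objects, each following its own tracked color (this bare-bones Paddle class is only a sketch; in the full program each paddle's position is updated every frame by the color tracking above):

// Each paddle follows its own tracked color
class Paddle {
  float x, y;        // position, filled in by the color tracker
  float side = 80;   // size of the square
  color trackColor;

  Paddle(color c) {
    trackColor = c;
    x = random(width);   // placeholder position; the tracker overwrites it
    y = random(height);
  }

  void display() {
    rectMode(CENTER);
    noFill();
    stroke(trackColor);
    strokeWeight(3);
    rect(x, y, side, side);
  }
}

ArrayList<Paddle> paddles = new ArrayList<Paddle>();

void setup() {
  size(640, 480);
  paddles.add(new Paddle(color(255, 0, 0)));  // the red pen
  paddles.add(new Paddle(color(0, 255, 0)));  // anything green
}

void draw() {
  background(0);
  for (Paddle p : paddles) {
    // In the full sketch, p.x and p.y are updated here from the color tracking
    p.display();
  }
}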
To make the Pong game, I need the ball to bounce in four directions off the four edges of the square paddle, so I divided the paddle into four parts:
The code looks like this:
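Stripped down to just the bounce logic, with a fixed square standing in for the tracked paddle and simplified variable names, the idea is:

float ballX = 100, ballY = 100;
float ballSpeedX = 4, ballSpeedY = 3;
float paddleX = 320, paddleY = 240, paddleSize = 100;

void setup() {
  size(640, 480);
}

void draw() {
  background(0);
  ballX += ballSpeedX;
  ballY += ballSpeedY;

  // Bounce off the window edges
  if (ballX < 0 || ballX > width)  ballSpeedX *= -1;
  if (ballY < 0 || ballY > height) ballSpeedY *= -1;

  // Which of the four parts did the ball hit? Compare how far it is
  // from the paddle's center on each axis.
  float half = paddleSize / 2;
  if (abs(ballX - paddleX) < half && abs(ballY - paddleY) < half) {
    float dx = ballX - paddleX;
    float dy = ballY - paddleY;
    if (abs(dx) > abs(dy)) {
      // Left or right part: reverse the horizontal direction
      ballSpeedX = (dx > 0) ? abs(ballSpeedX) : -abs(ballSpeedX);
    } else {
      // Top or bottom part: reverse the vertical direction
      ballSpeedY = (dy > 0) ? abs(ballSpeedY) : -abs(ballSpeedY);
    }
  }

  rectMode(CENTER);
  noFill();
  stroke(255);
  rect(paddleX, paddleY, paddleSize, paddleSize);
  fill(255, 0, 0);
  noStroke();
  ellipse(ballX, ballY, 24, 24);
}

Comparing abs(dx) and abs(dy) is what decides whether the ball came in through a left/right edge or a top/bottom edge, so each of the four parts pushes the ball back out the way it came instead of letting it stick inside the square.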
I also added a virtual explosion to exaggerate the ball-paddle collision.
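The explosion is just another ArrayList, this time full of short-lived particles that spawn at the collision point and fade out (a simplified sketch; here a mouse click stands in for the ball-paddle collision):

// A burst of particles that fades away over a few frames
class Particle {
  float x, y, vx, vy;
  float life = 255;  // doubles as the alpha value

  Particle(float startX, float startY) {
    x = startX;
    y = startY;
    vx = random(-5, 5);
    vy = random(-5, 5);
  }

  void update() {
    x += vx;
    y += vy;
    life -= 8;  // fade out a little every frame
  }

  boolean isDead() {
    return life <= 0;
  }

  void display() {
    noStroke();
    fill(255, 200, 0, life);
    ellipse(x, y, 8, 8);
  }
}

ArrayList<Particle> particles = new ArrayList<Particle>();

void explode(float x, float y) {
  for (int i = 0; i < 30; i++) {
    particles.add(new Particle(x, y));
  }
}

void setup() {
  size(640, 480);
}

void draw() {
  background(0);
  // Walk the list backwards so removing dead particles is safe
  for (int i = particles.size() - 1; i >= 0; i--) {
    Particle p = particles.get(i);
    p.update();
    p.display();
    if (p.isDead()) {
      particles.remove(i);
    }
  }
}

void mousePressed() {
  explode(mouseX, mouseY);  // stand-in for a ball-paddle collision
}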
Boom! It works!
Check out the AR Pong Game that can turn anything you have into a virtual paddle to hit the ball:
Of course there is a multiplayer mode:
It's too pathetic to play only by myself, so I invited my colleague to do a test run with me:
OK, I am done with the pen. Now it's time to try turning my face into a paddle.
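One handy way to track a face in Processing is the face detector in the OpenCV for Processing library; a minimal sketch of that approach (assuming the library is installed), with each detected rectangle standing in for a paddle, looks like this:

import gab.opencv.*;
import processing.video.*;
import java.awt.Rectangle;

Capture video;
OpenCV opencv;

void setup() {
  size(640, 480);
  video = new Capture(this, 640, 480);
  opencv = new OpenCV(this, 640, 480);
  opencv.loadCascade(OpenCV.CASCADE_FRONTALFACE);  // Haar cascade for faces
  video.start();
}

void draw() {
  if (video.available()) video.read();
  image(video, 0, 0);

  opencv.loadImage(video);
  Rectangle[] faces = opencv.detect();

  // Each detected face becomes a square "paddle"
  noFill();
  stroke(0, 255, 0);
  strokeWeight(3);
  for (Rectangle face : faces) {
    rect(face.x, face.y, face.width, face.height);
    // In the game, this rectangle is what the ball bounces off
  }
}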
And of course I tried to play in multiplayer mode:
To add levels of difficulty, I coded a function that speeds up the ball when the score reaches 100, 300, 500, and 1000.
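Stripped down, the difficulty function looks something like this (the 1.5x multiplier is just an example value):

int score = 0;
int level = 0;                             // how many speed-ups have fired
int[] milestones = {100, 300, 500, 1000};  // scores that trigger a speed-up
float ballSpeedX = 4, ballSpeedY = 3;

void speedUp() {
  if (level < milestones.length && score >= milestones[level]) {
    ballSpeedX *= 1.5;
    ballSpeedY *= 1.5;
    level++;  // each milestone only fires once
  }
}

void setup() {
  // Simulate the score climbing to show when the speed changes
  for (score = 0; score <= 1000; score += 50) {
    speedUp();
    println(score + " -> speed " + ballSpeedX + ", " + ballSpeedY);
  }
}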
I am very satisfied with the result. I was able to use ArrayLists to create the multiple squares and the explosion, core computer vision ideas to track motion and color, and functions, conditionals, and loops to make a simple game. Yeah!