[Kinect] Sky touch 3D objects / Hand tracking
As a viewer for 3D objects, I thought that sky-touch would be a nice tool to use. That means you wave your hands in the air (remember Minority Report) and the 3D object on the screen moves/zooms/rotates accordingly. Later, I would like to incorporate 3D-to-3D vision. By that I mean projecting this onto a 3D screen, so that you can virtually move 3D objects in the space around you while looking at the object in 3D.
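To make the idea concrete, here is a minimal sketch of how a tracked hand position could be mapped to a rotation and zoom of the on-screen model. This is my own illustration, not the actual implementation: the coordinate frame (metres, hand relative to a neutral position in front of the sensor) and all gain values are placeholder assumptions.

```python
import math

def hand_to_transform(hand, neutral=(0.0, 0.0, 1.2)):
    """Map a tracked hand position (hypothetical camera-space metres)
    to (yaw, pitch, zoom) for the on-screen 3D model.

    Sideways offset -> yaw, vertical offset -> pitch,
    pushing towards the sensor -> zoom in. Gains are placeholders.
    """
    dx = hand[0] - neutral[0]
    dy = hand[1] - neutral[1]
    dz = hand[2] - neutral[2]
    yaw = max(-90.0, min(90.0, dx * 180.0))    # degrees per metre sideways
    pitch = max(-90.0, min(90.0, -dy * 180.0))  # screen-up lowers pitch
    zoom = math.exp(-dz)                        # hand forward -> zoom > 1
    return yaw, pitch, zoom
```

A real version would read the hand joint from the Kinect skeleton stream each frame and smooth the values, but the core mapping is just this kind of offset-to-angle function.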
There are many obstacles to overcome, because waving in the air is open to interpretation, so a set of commands needs to be defined. But that is for later.
The goal is to be able to move objects that have been photographed from different angles. For now I have chosen only to rotate a model, since I have not yet been provided with enough all-round photographic material. I guess I would need a hundred or more pictures from different angles in order to produce something useful.
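With that approach, "rotating" the object really means picking the photograph whose shooting angle is closest to the current rotation. A small sketch of that selection, assuming the pictures are taken at evenly spaced angles around the object (the function name and layout are my own invention):

```python
def frame_for_angle(angle_deg, num_frames):
    """Pick the index of the photograph closest to the given rotation,
    assuming num_frames pictures evenly spaced over a full 360° turn."""
    wrapped = angle_deg % 360.0            # normalise into [0, 360)
    step = 360.0 / num_frames              # angular spacing between shots
    return round(wrapped / step) % num_frames
```

So with 100 pictures, a yaw of 180° from the hand tracker would display picture 50; as the hand sweeps sideways, successive frames play back like a flip-book and the object appears to turn.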
So this is the result of my first test: