[Kinect] Sky touch 3D objects / Hand tracking

As a viewer for 3D objects, I thought "sky touch" would be a nice tool: you wave your hands in the air (remember Minority Report) and the 3D object on the screen moves, zooms and rotates accordingly. Later I would like to incorporate 3D-to-3D interaction, by which I mean projecting this onto a 3D screen. You could then virtually move 3D objects in the space around you while looking at them in 3D.

There are many obstacles to overcome: waving in the air is open to interpretation, so the gestures need to be mapped to well-defined commands (a rough idea of such a mapping is sketched below). But that is for later.
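As a minimal sketch of what "defining commands" could look like: take the tracked hand position from whatever Kinect hand tracker is used (e.g. OpenNI/NiTE), look at the displacement between frames, and map the dominant axis of motion to a rotate or zoom command. The function name, thresholds and scaling constants below are my own assumptions for illustration, not the implementation used in the video.

```python
DEAD_ZONE_MM = 30.0   # ignore jitter smaller than this

def hand_to_command(prev, curr):
    """Map the hand displacement between two frames to a (command, amount) pair.

    prev, curr: (x, y, z) hand positions in millimetres, camera coordinates.
    Horizontal/vertical motion rotates the model; motion towards or away
    from the sensor zooms it. Small motions are treated as noise.
    """
    dx = curr[0] - prev[0]
    dy = curr[1] - prev[1]
    dz = curr[2] - prev[2]

    # Pick the dominant axis so one wave means exactly one thing.
    ax, ay, az = abs(dx), abs(dy), abs(dz)
    if max(ax, ay, az) < DEAD_ZONE_MM:
        return ("idle", 0.0)
    if ax >= ay and ax >= az:
        return ("rotate_yaw", dx * 0.2)    # degrees per mm, tuning constant
    if ay >= az:
        return ("rotate_pitch", dy * 0.2)
    return ("zoom", -dz * 0.001)           # scale factor per mm

# Example: a 120 mm wave to the right becomes a 24 degree yaw rotation.
print(hand_to_command((0, 0, 800), (120, 5, 795)))
```

The dead zone and the "dominant axis wins" rule are one simple way to make free-form waving less ambiguous; a real gesture vocabulary would of course need more than this.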

The goal is to be able to move objects that have been photographed from many different angles. For now I have chosen to rotate only a model, since I have not yet been provided with enough all-around photographic material; I guess I would need a hundred or more pictures from different angles to produce something useful. A sketch of how such a set of views could be indexed follows below.
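To make the "photographed from different angles" idea concrete: with N photos taken at evenly spaced yaw angles around the object, the viewer only has to show the photo whose capture angle is closest to the current rotation. The constant and file layout here are assumptions for illustration, not the actual dataset.

```python
N_VIEWS = 100                      # e.g. one photo every 3.6 degrees

def view_index(yaw_degrees: float) -> int:
    """Return the index of the pre-captured photo closest to the given yaw."""
    step = 360.0 / N_VIEWS
    return int(round((yaw_degrees % 360.0) / step)) % N_VIEWS

# Example: a 95 degree rotation maps to photo 26 of 100 (95 / 3.6 is about 26.4).
print(view_index(95.0))            # -> 26
```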

So this is the result of my first test:

[youtube=http://www.youtube.com/watch?v=LRnHU1Z8OKU&w=560&h=345]
