Kinect 3D gesture recognition providing model navigation – Natural User Interface


Interactive Mendeleev table
Application startup: Kinect calibration. User tracking is previewed at the bottom of the screen.
Navigation gesture: with the left hand held near the hip, right-hand movement navigates through the model.
Face recognition: detected faces of people passing by are visualised in the table.
Zoom gesture: with both hands held above the hips, the distance between the hands zooms the model in and out.
Lack of gesture: both hands held near the hips cause no model navigation.
Model navigation: the application enables exploring large, high-resolution visualisations.
User interaction: gestures performed by the user are recognised and classified against a predefined set of gestures.

Master's thesis, 2011
Department of Computer Architecture: http://www.eti.pg.gda.pl/katedry/kask/
Author: Bartek Nowakowski, http://bnowakowski.pl
Supervisor: Tomasz Dziubich, PhD MEng, http://www.eti.pg.gda.pl/katedry/kask/pracownicy/Tomasz.Dziubich/
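The gesture rules in the captions above (hands near or above the hips selecting between no action, navigation, and zoom) could be sketched as a simple per-frame classifier. This is only an illustrative sketch under assumed joint names and an assumed "near hip" threshold, not the thesis's actual implementation:

```python
# Hypothetical sketch of the rule-based gesture classification described above.
# Joint layout and the 0.15 m "near hip" threshold are illustrative assumptions.
import math
from dataclasses import dataclass

@dataclass
class Joint:
    x: float
    y: float
    z: float

def dist(a: Joint, b: Joint) -> float:
    """Euclidean distance between two tracked joints."""
    return math.sqrt((a.x - b.x) ** 2 + (a.y - b.y) ** 2 + (a.z - b.z) ** 2)

def classify(left_hand: Joint, right_hand: Joint,
             left_hip: Joint, right_hip: Joint,
             near: float = 0.15) -> str:
    """Map one skeleton frame to a gesture label."""
    left_near = dist(left_hand, left_hip) < near
    right_near = dist(right_hand, right_hip) < near
    if left_near and right_near:
        return "none"        # lack of gesture: no model navigation
    if left_near:
        return "navigate"    # right-hand movement drives navigation
    if left_hand.y > left_hip.y and right_hand.y > right_hip.y:
        return "zoom"        # distance between hands sets the zoom factor
    return "none"
```

For the zoom gesture, a zoom factor could then be derived from `dist(left_hand, right_hand)` relative to the hand distance in the previous frame.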

2 Replies to “Kinect 3D gesture recognition providing model navigation – Natural User Interface”

  • Cool demo. Maybe you should enter the Kinect gesture recognition demo competition at CVPR 2012 ($10,000 in prizes and up to $100,000 in licensing offered by Microsoft). See gesture.chalearn.org/dissemination/cvpr2012.

  • The competition is in the US, though, so travel could be a problem, but I could give it a try. Thanks for the info!
