Implement Hand Gesture Recognition With XRDrive Sim | AI News | Intel Software


I’m David Shaw, and this is AI News. Imagine what it would be like to operate a computer using hand gestures. Sound familiar? Maybe you saw the concept in the Tom Cruise movie Minority Report. Even though the movie is set in 2054, you might be surprised to know this technology is available today. Enter XRDrive Sim, a 3D augmented steering wheel controlled by hand gestures. The steering wheel is used to navigate roads in a virtual environment for driving school training simulations. The hand gesture model was optimized using the Model Optimizer from the Intel Distribution of OpenVINO Toolkit.
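
The video doesn’t show the conversion step itself, but as a minimal sketch (assuming a frozen TensorFlow graph with the hypothetical name hand_gesture_frozen.pb, and the "mo" command that recent OpenVINO releases install; older releases ship mo.py instead), the Model Optimizer can be invoked to produce OpenVINO’s Intermediate Representation, an .xml topology plus a .bin weights file:

```python
# Minimal sketch: convert a trained gesture model to OpenVINO IR with the
# Model Optimizer. The input file name is hypothetical; "mo" is the console
# command installed with the Intel Distribution of OpenVINO Toolkit.
import subprocess

subprocess.run(
    [
        "mo",
        "--input_model", "hand_gesture_frozen.pb",  # trained model (hypothetical name)
        "--output_dir", "ir",                        # writes the .xml/.bin IR pair
    ],
    check=True,
)
```
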
Deploying deep learning networks from the training environment to embedded platforms for inference is a complex task that introduces many technical challenges. To overcome these, the Inference Engine from the Intel Distribution of OpenVINO Toolkit was paired with an OpenCV back end.
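
In practice, that pairing maps onto OpenCV’s DNN module, which can load the IR produced above and execute it through the Inference Engine backend. A minimal sketch follows; the file names, input size, and webcam source are assumptions rather than details from the article:

```python
# Minimal sketch: run the optimized gesture model through OpenCV's DNN module
# with the OpenVINO Inference Engine backend on an Intel CPU.
import cv2

net = cv2.dnn.readNetFromModelOptimizer("ir/hand_gesture_frozen.xml",
                                        "ir/hand_gesture_frozen.bin")
net.setPreferableBackend(cv2.dnn.DNN_BACKEND_INFERENCE_ENGINE)  # pair with the Inference Engine
net.setPreferableTarget(cv2.dnn.DNN_TARGET_CPU)

cap = cv2.VideoCapture(0)          # webcam watching the driver's hands
ok, frame = cap.read()
if ok:
    # Preprocess the frame into the network's expected input blob (size is assumed).
    blob = cv2.dnn.blobFromImage(frame, scalefactor=1.0 / 255,
                                 size=(224, 224), swapRB=True)
    net.setInput(blob)
    scores = net.forward()         # per-gesture confidence scores
    print("predicted gesture id:", int(scores.argmax()))
cap.release()
```
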
This interface was used to control the mouse and keyboard in a 3D racing game.
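
The video doesn’t name the input library, so as one possible sketch, a recognized gesture label could be translated into mouse and keyboard events with pynput; the gesture names below are hypothetical:

```python
# Minimal sketch: translate recognized gestures into mouse/keyboard input for
# the racing game. The gesture labels and the use of pynput are assumptions,
# not details from the article.
from pynput.keyboard import Controller as Keyboard, Key
from pynput.mouse import Controller as Mouse

keyboard = Keyboard()
mouse = Mouse()

def apply_gesture(gesture: str) -> None:
    """Map a recognized gesture label to a game control action."""
    if gesture == "steer_left":
        mouse.move(-20, 0)            # nudge the cursor left to turn the wheel
    elif gesture == "steer_right":
        mouse.move(20, 0)             # nudge the cursor right
    elif gesture == "accelerate":
        keyboard.press(Key.up)        # tap the accelerator key
        keyboard.release(Key.up)

apply_gesture("steer_left")           # e.g. an open palm tilted to the left
```
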
The results from XRDrive Sim indicate that the model enables successful gesture recognition with a very low computational load, allowing a gesture-based interface on Intel processors. Check out the article, where you’ll learn about the experimental setup, how to define hand gestures, and even how to optimize the model.
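
The article covers how the gestures themselves are defined; purely as a generic illustration (not necessarily the article’s method), hand poses are often distinguished by counting convexity defects, the gaps between extended fingers, in a segmented hand contour:

```python
# Generic illustration of defining gestures from a hand silhouette by counting
# convexity defects (gaps between extended fingers). This is a common OpenCV
# technique, not necessarily the one used in the XRDrive Sim article.
import cv2
import numpy as np

def count_finger_gaps(mask: np.ndarray) -> int:
    """Return the number of deep convexity defects in the largest contour."""
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return 0
    hand = max(contours, key=cv2.contourArea)            # largest blob = the hand
    hull = cv2.convexHull(hand, returnPoints=False)      # hull as point indices
    defects = cv2.convexityDefects(hand, hull)
    if defects is None:
        return 0
    # Keep only defects deep enough to be gaps between extended fingers
    # (the depth field is in fixed-point units, i.e. pixels * 256).
    return int(np.sum(defects[:, 0, 3] > 10000))
```
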
Additionally, you can run the XRDrive Sim demo by downloading the code from the GitHub repository via the links. Imagine what you could build with hand gesture interfaces. Read the full article via the links, and I’ll see you next week.
