AHNE – Audio-Haptic Navigation Environment

This is a demonstration video of AHNE – Audio-Haptic Navigation Environment.

AHNE is an audio-haptic user interface that lets the user locate and manipulate sound objects in 3D space, guided by combined audio and haptic feedback.

The user is tracked with a Kinect sensor using the OpenNI framework and OSCeleton (github.com/Sensebloom/OSCeleton), which streams the tracked skeleton joints as OSC messages.
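For readers curious how the tracking data might be consumed: OSCeleton emits one `/joint` message per tracked joint, carrying the joint name, a user id, and x/y/z coordinates. The sketch below is not from the AHNE source; it assumes the default OSCeleton message layout and simply caches the latest position of each joint.

```python
# Minimal sketch of an OSCeleton /joint message handler.
# Assumes OSCeleton's default argument order: name (str), user id (int),
# then x, y, z coordinates (floats). Not taken from the AHNE code.

skeleton = {}  # (joint_name, user_id) -> (x, y, z)

def handle_joint(address, name, user_id, x, y, z):
    """Store the most recent position reported for a joint."""
    skeleton[(name, user_id)] = (x, y, z)

# In a live setup this handler would be registered with an OSC server,
# e.g. python-osc's Dispatcher mapped to the "/joint" address, listening
# on OSCeleton's port. Here we just call it directly:
handle_joint("/joint", "r_hand", 1, 0.4, 0.5, 0.6)
```

With the hand position cached like this, the rest of the application can poll `skeleton` on every frame to drive the audio and haptic cues.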

The user wears a glove embedded with sensors and a small vibration motor that provides the haptic feedback.

This is just the first proof-of-concept demo. More videos coming soon.
