Hands as Dynamic Input Devices

Hands as Dynamic Input Devices for Interactive 3D Sketching & Modelling Within Virtual Reality (2004)

Macquarie University ICS Start-Up Grant

Investigators: Kavakli, M

This project provides the fundamental research for a competitive grant application. The objective is to develop a prototype Virtual Reality (VR) interface that recognises simple hand gestures and builds an interactive 3D model of a sketch drawn by the user. The 3D model is generated on the fly and displayed using VR equipment.

The core questions we answered are: "How do we generate 3D models of real objects by sketching in virtual reality in real time?" and "How can we use the user's hands as dynamic input devices in a virtual environment?" This project examines a novel environment in which a designer can define the contour of a sketch using a pair of data gloves in 3D space. Both the device and the pointer incorporate 3D position sensors, so that drawing primitives entered are recreated in real time on a helmet display worn by the user. In this way the device and the pointer provide a virtual "3D sketch pad", and the designer has the benefit of a stereo image. We developed a prototype 3D Sketchpad system that recognises simple hand gestures. The system was implemented in OpenGL and C++ by integrating the head-mounted display and the data gloves. I had two research programmers working on the implementation (Matthew Roberts and Dilshan Jayarathna). We reported on the system architecture in our recent publications. In order to simulate a real hand, we designed a more realistic 3D hand model using a 3D modelling package (3ds Max) and animated it using the real-time data received from the data gloves.

I administered the work of the two research programmers employed on my ICS Research Start-Up Grant. Initially we had only one Virtual Reality programmer (Matthew Roberts) in the Department; during the administration of the grant, the expert trained the novice programmer (Dilshan Jayarathna), who has since started his Honours thesis in Virtual Reality as a result. In this research project, we used a variety of hand gestures to fulfil the different requirements of sketching behaviour as a communication vehicle with the VR system. Different gestures may carry different meanings for the specific tasks associated with sketching. We used simple approximation algorithms to identify sketch primitives such as lines and curves, in order to derive a more precise model from the raw sketches.
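As an illustration of the kind of simple approximation just mentioned (the project's actual algorithms are not reproduced here), a stroke can be classified as a straight line if every sampled point lies within a tolerance of the chord joining its endpoints; this is the core test behind polyline-simplification schemes such as Ramer-Douglas-Peucker. The `Point` type and `isApproxLine` function below are hypothetical names for this sketch:

```cpp
#include <cmath>
#include <vector>

struct Point { double x, y; };

// Returns true if every point of the stroke lies within `tol` of the
// straight line through the stroke's first and last points.
bool isApproxLine(const std::vector<Point>& stroke, double tol) {
    if (stroke.size() < 3) return true;  // two points always form a line
    const Point& a = stroke.front();
    const Point& b = stroke.back();
    double dx = b.x - a.x, dy = b.y - a.y;
    double len = std::sqrt(dx * dx + dy * dy);
    if (len == 0.0) return false;  // closed stroke: treat as non-line
    for (const Point& p : stroke) {
        // Perpendicular distance from p to the infinite line through a and b.
        double dist = std::fabs(dy * (p.x - a.x) - dx * (p.y - a.y)) / len;
        if (dist > tol) return false;
    }
    return true;
}
```

A stroke that fails this test would then be fitted against curve primitives instead.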

To recognise hand gestures, the system evaluates the flexing of each finger of the hand and decides which gesture is currently being made. In the real-hand simulation, the flexing of the fingers is approximated as closely as possible to the real hand using flexure values between 0 and 1 inclusive. If the flexure value of a finger is less than or equal to 0.1, the finger is assumed to be fully flexed; if the value is greater than or equal to 0.9, the finger is totally un-flexed. In this way we can define up to 32 gestures, since there are 2^5 (= 32) possible combinations of fully flexed (<= 0.1) and un-flexed (>= 0.9) values across the five fingers. The flexure values also depend on which data glove is used: more accurate gestures can be recognised when the number of sensors per finger is higher. For example, the Data Glove 16W has three sensors per finger, attached to each joint of each finger.
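The thresholding scheme above can be sketched as follows. This is a minimal illustration, not the project's actual code; the names `classify` and `gestureCode` are hypothetical, and the thumb is arbitrarily assigned to bit 0:

```cpp
#include <array>

// Per the thresholds above: flexure <= 0.1 means fully flexed,
// flexure >= 0.9 means fully un-flexed; anything in between is ambiguous.
enum class FingerState { Flexed, Unflexed, Ambiguous };

FingerState classify(double flexure) {
    if (flexure <= 0.1) return FingerState::Flexed;
    if (flexure >= 0.9) return FingerState::Unflexed;
    return FingerState::Ambiguous;
}

// Packs the five finger states into a 5-bit code (thumb = bit 0),
// giving the 2^5 = 32 distinguishable gestures. Returns -1 when any
// finger is between the two thresholds, i.e. no gesture is recognised.
int gestureCode(const std::array<double, 5>& flexures) {
    int code = 0;
    for (int i = 0; i < 5; ++i) {
        FingerState s = classify(flexures[i]);
        if (s == FingerState::Ambiguous) return -1;
        if (s == FingerState::Flexed) code |= (1 << i);
    }
    return code;
}
```

For instance, a closed fist (all flexure values near 0) maps to code 31 and an open hand (all values near 1) to code 0.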

During this research study, we also found that the data gloves come with a set of orientation trackers capable of interpreting only 2D data (the positions of the fingers). To be able to process 3D data (in the depth buffer), we developed a switch that tracks the motion of the hand in the third dimension. We use this switch to zoom in and out along the third dimension using a mouse or keyboard. However, this is not ideally suited to the needs of a natural sketching interface. To eliminate this problem, we would need a motion capture suit to provide an extra set of motion trackers.
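The workaround described above can be pictured as a small piece of state driven by keyboard or mouse-wheel events while the gloves supply x/y. The class name, step size, and volume bounds below are assumptions for illustration only:

```cpp
#include <algorithm>

// Hypothetical "depth switch": keyboard or mouse-wheel events step the
// cursor along the third axis, since the glove trackers only give 2D data.
class DepthSwitch {
public:
    // Called on each input event; direction +1 zooms in, -1 zooms out.
    // The depth is clamped to the bounds of the sketching volume.
    void step(int direction) {
        depth_ = std::clamp(depth_ + direction * stepSize_, minZ_, maxZ_);
    }
    double depth() const { return depth_; }

private:
    double depth_ = 0.0;
    double stepSize_ = 0.1;            // assumed step per event
    double minZ_ = -1.0, maxZ_ = 1.0;  // assumed sketch-volume bounds
};
```

The clamping prevents the cursor from leaving the drawing volume, but the discrete stepping is exactly why this feels less natural than a third tracked axis would.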
