Hand Gestures in CAD Systems

Tumkor, S., Esche, S. K. & Chassapis, C.
Proceedings of the ASME International Mechanical Engineering Congress and Exposition IMECE'13, San Diego, California, USA, November 13-21, 2013.

Abstract

When designing products, networked computers are increasingly used to facilitate collaboration among team members at remote locations. Design visualization plays a critical role in understanding the design concepts shared by the design team members. CAD systems have 3D visualization capabilities that are designed to help users understand complex structures easily and to design better products. However, 3D visualization on a 2D screen has limitations in communicating complex structures. Furthermore, gestures play a significant role in face-to-face communication but are missing in remote collaboration. Manipulating objects in 3D using gestures in real physical space, without a cumbersome hand-held peripheral device, may ease the visualization and understanding of the concept model. Today, peripheral devices for human-computer interfaces are not only becoming more advanced but also less expensive. Some game controllers are now equipped with depth (RGB-D) cameras and have great potential for complementing or even replacing the traditional keyboard-and-mouse interface with hand or body gestures. In this paper, new low-cost mixed reality devices for 3D user input and visual output are investigated, and their possible uses with CAD systems in the collaborative design process are discussed.

In this study, a hand tracking system based on two Kinect sensors was used to track the position and orientation of the user's hands. With this system, CAD models can be manipulated and disassembled using hand gestures. The new user interface provides user-friendly interaction between the designer and the CAD system.
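To illustrate the general idea of gesture-based model manipulation, the sketch below interprets the motion of two tracked hand positions between successive frames as a rotate/zoom command. This is a minimal, hypothetical example: the paper's actual gesture vocabulary, the fusion of the two Kinect sensor streams, and the coupling to the CAD system are not specified here, and the function names and 2D simplification are the author's assumptions for illustration only.

```python
import math

def two_hand_gesture(prev_left, prev_right, left, right):
    """Interpret two tracked hand positions (x, y) across two frames
    as a rotate/zoom gesture.

    Hypothetical sketch only: a real system would use 3D joint data
    from the depth-camera SDK and filter it for noise.
    Returns (rotation_delta_radians, zoom_factor).
    """
    def vec(a, b):
        # Vector from point a to point b
        return (b[0] - a[0], b[1] - a[1])

    v0 = vec(prev_left, prev_right)  # inter-hand vector, previous frame
    v1 = vec(left, right)            # inter-hand vector, current frame

    # Rotation: change in the angle of the inter-hand vector
    rotation = math.atan2(v1[1], v1[0]) - math.atan2(v0[1], v0[0])

    # Zoom: ratio of inter-hand distances (hands moving apart -> zoom in)
    d0 = math.hypot(*v0)
    d1 = math.hypot(*v1)
    zoom = d1 / d0 if d0 else 1.0

    return rotation, zoom
```

For example, keeping the left hand fixed while sweeping the right hand from the user's right to straight up would be reported as a quarter-turn rotation with no zoom, while spreading both hands apart along a line would be reported as pure zoom. A CAD front end could then apply these deltas to the camera or to a selected part each frame.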