==== Tracking devices ====
 
3D user interaction systems are based primarily on [[motion capture|motion tracking]] technologies, which obtain all the necessary information from the user through the [[gesture recognition|analysis of their movements or gestures]]; these are known as tracking technologies.
 
Trackers detect or monitor head, hand or body movements and send that information to the computer. The computer then translates it and ensures that position and orientation are reflected accurately in the virtual world. Tracking is important in presenting the correct viewpoint, coordinating the spatial and sound information presented to users as well the tasks or functions that they could perform. 3D trackers have been identified as mechanical, magnetic, ultrasonic, optical, and hybrid inertial. Examples of trackers include [[Motion capture|motion trackers]], [[eye tracker]]s, and data gloves. A simple 2D mouse may be considered a navigation device if it allows the user to move to a different ___location in a virtual 3D space. Navigation devices such as the [[treadmill]] and [[bicycle]] make use of the natural ways that humans travel in the real world. Treadmills simulate walking or running and bicycles or similar type equipment simulate vehicular travel. In the case of navigation devices, the information passed on to the machine is the user's ___location and movements in virtual space. [[Wired glove]]s and bodysuits allow gestural interaction to occur. These send hand or body position and movement information to the computer using sensors.