Watch the video to see how users can create a screen, move an image on the screen, zoom in and out, paint using a palette of colors, select buttons on a menu, and more.
The video also explains how the system learns to identify and track finger movements so that it can recognize user commands.
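The post does not say how the system in the video actually does its tracking, but as a rough illustration of the idea, here is a minimal sketch of color-based fingertip tracking with OpenCV: it segments skin-like pixels, takes the largest contour as the hand, and treats its topmost point as the fingertip. This is an assumed, simplified approach; real gesture systems often rely on markers, depth cameras, or trained models instead.

```python
# Minimal fingertip-tracking sketch (illustrative only, not the system in the video).
import cv2
import numpy as np

cap = cv2.VideoCapture(0)  # default webcam; camera index is an assumption

while True:
    ok, frame = cap.read()
    if not ok:
        break

    # Rough skin-tone range in HSV; these thresholds are illustrative guesses
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, np.array([0, 30, 60]), np.array([20, 150, 255]))
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))

    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if contours:
        hand = max(contours, key=cv2.contourArea)
        if cv2.contourArea(hand) > 2000:  # ignore small blobs
            # Topmost contour point serves as a crude fingertip estimate
            tip = tuple(int(v) for v in hand[hand[:, :, 1].argmin()][0])
            cv2.circle(frame, tip, 8, (0, 255, 0), -1)
            # A full system would map this point to commands (draw, zoom, select)

    cv2.imshow("fingertip", frame)
    if cv2.waitKey(1) & 0xFF == 27:  # press Esc to quit
        break

cap.release()
cv2.destroyAllWindows()
```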
Tuesday, October 18, 2011