For our capstone project at U of T in 2012, my teammates (Soumyajit Kundu and Owajigbanam Ogbuluijah) and I worked on an Android application that recognizes hand gestures to control hardware. This was not one of the projects proposed by the professors; we came up with the idea ourselves and asked Professor Andreas Moshovos to supervise it.
The following is an excerpt from our design documentation highlighting our motivation:
Modern society is heavily dependent on technology and computers for numerous daily activities. Most of our interactions with computers are through graphical user interfaces (GUIs), and innovations in technology have been pushing the limits of how we interact with our machines over the past few decades. One recent successful example is the surge in popularity of voice-controlled personal assistants, such as Siri and Alfred, which enhance and simplify our interactions with our smartphones [1].
Recently, there has also been increased interest in another class of human-computer interaction: hand gestures. Early work in this area involved glove-based devices with many physical connection points to a computer for analyzing gestures. Although such devices have potential applications in specialized domains such as surgery simulation, they do not improve a day-to-day computer user’s experience. One solution to this problem is visual interpretation of gestures via optical devices such as cameras [2]. A recent application of this technology was the 3D vision gesture control system built by GestureTek in 2000; using it as an interface, users can interact with any display screen from a distance [3]. Another recent development in this area is an application called “Flutter”, which runs on both Mac and Windows and controls media players such as iTunes, VLC, Winamp, and Windows Media Player via hand gestures read through the webcam [4]. Currently, the application recognizes only one hand gesture, which stops and starts the aforementioned media players.
These are just a few achievements in the fairly new area of human-computer interaction. As a group of fourth-year electrical and computer engineering students, we were intrigued to be part of this relatively new technological revolution of vision-based gesture control and therefore decided to build a gesture control system in which we could ultimately control hardware motion via hand gestures.
Despite the numerous possible applications of gesture-controlled hardware and the availability of existing literature, we decided to develop our gesture recognition application from scratch. Our approach to the design was therefore incremental and simultaneous development of the software and hardware components, finally bridging the gap by integrating the two sides. Although hardware control via gestures is a project requirement, the main premise of the proposed design is to develop a functional Android application that recognizes and interprets various gestures, can be used as standalone software on smartphones, and is open to further development in the future. The hardware control unit is, simply put, a tangible manifestation of the recognized gestures for demonstration purposes.
Check out the demo of our project: