Researchers are developing new hand gesture technology that allows users to carry out commands on computers without the need for a keyboard or mouse.
The prototype human-computer interaction technology, labelled ‘Typealike’, works through a regular laptop webcam with a simple affixed mirror. The program recognises the user’s hands and gestures beside or near the keyboard, triggering operations based on different hand positions.
A user could, for instance, place their right hand beside the keyboard with the thumb pointing up, and the program would recognise this as a signal to increase the volume. Different gestures, and combinations of gestures, can be programmed to carry out a wide range of operations.
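The article does not describe how that mapping is implemented; as a purely illustrative sketch, a recognised gesture label could be routed to an action through a simple dispatch table. The gesture names and actions below are hypothetical, not Typealike’s actual bindings.

```python
# Hypothetical sketch: mapping recognised gesture labels to actions.
# Gesture names and actions are invented for illustration only.

def volume_up():
    print("Volume up")

def volume_down():
    print("Volume down")

def mute():
    print("Mute toggled")

GESTURE_ACTIONS = {
    "right_thumb_up": volume_up,      # e.g. right hand beside keyboard, thumb up
    "right_thumb_down": volume_down,
    "flat_palm_down": mute,
}

def handle_gesture(label: str) -> None:
    """Run the action bound to a recognised gesture, ignoring unknown labels."""
    action = GESTURE_ACTIONS.get(label)
    if action is not None:
        action()

# Example: the recogniser reports a thumbs-up beside the keyboard.
handle_gesture("right_thumb_up")
```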
The innovation in human-computer interaction aims to make the user experience faster and smoother, reducing reliance on keyboard shortcuts and on the mouse and trackpad.
How the ‘Typealike’ technology was developed
“It started with a simple idea about new ways to use a webcam,” explained Nalin Chhibber, a recent master’s graduate from the University of Waterloo’s Cheriton School of Computer Science. “The webcam is pointed at your face, but most of the interaction happening on a computer is around your hands. So we thought, what could we do if the webcam could pick up hand gestures?”
The initial insight led to the development of a small mechanical attachment that redirects the webcam downwards towards the hands. The team then designed a software program capable of understanding distinct hand gestures in variable conditions and for different users. The team utilised machine learning techniques to train the Typealike program.
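The capture pipeline itself is not detailed in the article; the following is only a rough, hypothetical sketch of the idea, assuming OpenCV for frame capture. It reads frames from the redirected webcam, undoes the mirror reflection and hands each frame to a recogniser stub (the flip axis would depend on how the mirror is actually mounted).

```python
# Hypothetical capture loop: read frames from the laptop webcam, undo the
# mirror reflection introduced by the attachment, and pass each frame to a
# gesture recogniser. Library choice (OpenCV) and flip axis are assumptions.
import cv2

def recognise_gesture(frame):
    """Placeholder for the trained gesture classifier."""
    return None  # e.g. "right_thumb_up"

cap = cv2.VideoCapture(0)              # default laptop webcam
try:
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        frame = cv2.flip(frame, 0)     # undo the vertical mirror reflection
        label = recognise_gesture(frame)
        if label is not None:
            print("Recognised gesture:", label)
        cv2.imshow("hands", frame)
        if cv2.waitKey(1) & 0xFF == ord("q"):  # press 'q' to stop
            break
finally:
    cap.release()
    cv2.destroyAllWindows()
```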
“It’s a neural network, so you need to show the algorithm examples of what you’re trying to detect,” said Fabrice Matulic, senior researcher at Preferred Networks Inc and former postdoctoral researcher at Waterloo. “Some people will make gestures a little bit differently, and hands vary in size, so you have to collect a lot of data from different people with different lighting conditions.”
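The Typealike model itself is not published here; the sketch below is only a minimal illustration of the kind of supervised setup Matulic describes, with a small convolutional network trained on labelled frames of hands. Random tensors stand in for the real training data, which, as he notes, would need to cover many users, hand sizes and lighting conditions.

```python
# Minimal sketch of a gesture classifier of the kind described above.
# This is NOT the Typealike model; random tensors stand in for labelled
# webcam frames collected from many users under varied lighting.
import torch
import torch.nn as nn

NUM_GESTURES = 8                     # hypothetical number of gesture classes
BATCH, CHANNELS, H, W = 32, 3, 64, 64

class GestureNet(nn.Module):
    def __init__(self, num_classes: int):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(CHANNELS, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Linear(32 * (H // 4) * (W // 4), num_classes)

    def forward(self, x):
        x = self.features(x)
        return self.classifier(x.flatten(1))

model = GestureNet(NUM_GESTURES)
optimiser = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

# Stand-in batch: in practice these would be cropped frames of the hands
# and their gesture labels, gathered from many different volunteers.
frames = torch.randn(BATCH, CHANNELS, H, W)
labels = torch.randint(0, NUM_GESTURES, (BATCH,))

for step in range(5):                # a few illustrative optimisation steps
    logits = model(frames)
    loss = loss_fn(logits, labels)
    optimiser.zero_grad()
    loss.backward()
    optimiser.step()
    print(f"step {step}: loss {loss.item():.3f}")
```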
Collecting data for human-computer interaction technology
The team recorded a database of hand gestures from dozens of research volunteers. They also had the volunteers perform tests and complete surveys to help the team understand how to make the program as functional, versatile and accessible as possible.
“We’re always setting out to make things people can easily use,” concluded Daniel Vogel, an associate professor of computer science at Waterloo. “People look at something like Typealike, or other new tech in the field of human-computer interaction, and they say it just makes sense. That’s what we want. We want to make technology that’s intuitive and straightforward, but sometimes to do that takes a lot of complex research and sophisticated software.”
The researchers add that the Typealike program also has potential applications in virtual reality, where it could eliminate the need for hand-held controllers.