Send a smiley with a smile. A prototype earbud can detect facial expressions and convert them into smartphone controls, like answering a phone call when you wink or launching Wikipedia when you open your mouth. It could be used by people with impaired movement or as a hands-free tool for drivers.
“We’re not trying to replace current input methods, just complement them,” says Denys Matthies at the Fraunhofer Institute for Computer Graphics Research in Rostock, Germany. You’re not always able to take your phone out of your pocket or look down at the screen, but might still like to be able to pause your music or pick up a call, he says.
Matthies and his colleagues developed a prototype system consisting of an earbud kitted out with electrodes that recognise changes in ear-canal shape when you make different facial expressions. A reference electrode attaches to the earlobe with a peg. When someone smiles, for example, muscles in the ear move as well. This causes the earbud to deform and produce a detectable change in electrical field that can be mapped to the corresponding face movement.
At the moment, the system can detect five different expressions – smiling, winking an eye, turning your head to the right, opening your mouth and saying a “shh” sound – with an accuracy of 90 per cent.
“It’s currently still just a research project, but something as simple as answering a call with a facial expression could be possible soon,” says Matthies. It could also give people with disabilities more options to interact with their smartphone.
A consumer version of the product could stop the system acting on false positives – like sending a message to someone every time you smile – by taking context into account. It could look for face movements that correspond to answering or declining a phone call only if the phone is ringing, for example.
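The context-aware gating described above can be sketched in a few lines. This is a minimal illustration, not code from the prototype: the function name, expression labels and phone-state flag are all assumptions made for the example.

```python
def handle_expression(expression: str, phone_ringing: bool) -> str:
    """Map a detected facial expression to an action, using phone
    state as context to suppress false positives.

    Hypothetical sketch: labels and actions are illustrative, not
    taken from the prototype described in the article.
    """
    # Only treat expressions as call controls while the phone is
    # actually ringing, so an ordinary smile triggers nothing.
    if phone_ringing:
        if expression == "smile":
            return "answer_call"
        if expression == "shh":
            return "decline_call"
    return "ignore"
```

With this kind of guard, the same smile that answers an incoming call is ignored the rest of the time, which is the behaviour the article suggests a consumer version would need.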
The work will be presented at a human–computer interaction conference in Denver, Colorado, in May.
Many people are looking into “socially unobtrusive” ways to interact with technology, says Daniel Ashbrook at the Rochester Institute of Technology in New York, who works on hands-free user interfaces. “It can be rude or inappropriate to look down at your phone and say, ‘Hey Siri, block that call’,” he says.
Technology like this could make basic tasks less disruptive. “It takes four seconds to get your phone out of your pocket and in a position to do something meaningful,” says Ashbrook, “so anything you can do in that time with just an earbud is a big win.”