These wild AI-powered glasses can read your own lips

AI GLASSES

A team of scientists at Cornell’s SciFi Lab has developed a pair of glasses that can read your lips, without your having to utter a single sound.

In 1993, Seinfeld based an entire episode on the perils of lipreading, culminating in George mistaking “sweeping together” for “sleeping together.” Outside of pop culture, the art of lipreading has fascinated psychologists, computer scientists, and forensic experts alike. In most cases, experiments have involved someone reading someone else’s lips—or in the case of lipreading programs like LipNet or Liopa, AI reading a human’s lips through a phone app. But a different kind of experiment is currently unfolding at Cornell’s Smart Computer Interfaces for Future Interactions (SciFi) Lab.

There, a team of scientists has devised a speech-recognition system that can identify up to 31 words in English. But EchoSpeech, as the system is called, isn’t an app—it’s a seemingly standard pair of eyeglasses. As outlined in a new research paper, the glasses (an off-the-shelf pair) can read the wearer’s lips and help those who can’t speak perform basic tasks, like unlocking their phone or asking Siri to crank up the TV volume, without uttering a single sound. It all looks like telekinesis, but the glasses—which are kitted out with two microphones, two speakers, and a microcontroller so small they practically blend in—actually rely on sonar.
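In broad strokes, that means the speakers emit sound toward the face and the microphones capture the reflections, which shift as the lips move. As a rough illustration only—the chirp frequencies, the toy reflector, and the cross-correlation step below are assumptions for this sketch, not details from the paper—here is what turning a recording into an “echo profile” might look like:

```python
# A minimal sketch of active acoustic sensing: a speaker emits a
# near-ultrasonic chirp, a microphone records the reflections, and
# cross-correlating the recording with the known chirp yields an
# "echo profile" whose peaks mark reflections at different delays.
# All parameters here are illustrative, not values from the paper.
import numpy as np
from scipy.signal import chirp, correlate

FS = 48_000                                       # sample rate (Hz)
DUR = 0.01                                        # 10 ms probe signal
t = np.linspace(0, DUR, int(FS * DUR), endpoint=False)
probe = chirp(t, f0=18_000, f1=21_000, t1=DUR)    # near-inaudible sweep

def echo_profile(recording: np.ndarray) -> np.ndarray:
    """Correlate the recording with the known probe signal;
    peaks correspond to reflections at different round-trip delays."""
    return np.abs(correlate(recording, probe, mode="valid"))

# Simulate one reflection from a surface ~5 cm away (say, the lips),
# with sound traveling ~343 m/s: a 0.1 m round trip is ~0.29 ms.
delay = int(FS * (2 * 0.05) / 343)
recording = np.zeros(len(probe) + 2 * delay)
recording[delay:delay + len(probe)] += 0.5 * probe  # attenuated echo

profile = echo_profile(recording)
print("strongest echo at sample", int(np.argmax(profile)),
      "vs. expected delay of", delay)
```

As the lips change shape, the pattern of peaks in that profile changes with them, which is the kind of signal a recognition model can learn to classify.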
Over a thousand species use sonar to hunt and survive. Perhaps the best known among them is the whale, which sends out pulses of sound that bounce off objects in the water and return as echoes; by processing those echoes, the mammal builds a mental picture of its environment, including the size and distance of the objects around it.
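The distance part of that picture comes down to simple arithmetic: halve the echo’s round-trip time and multiply by the speed of sound in the medium. A minimal sketch of that calculation follows, with the seawater sound speed and the 0.4-second example chosen purely for illustration:

```python
# Echolocation ranging in miniature: distance = speed * round_trip / 2.
# The sound speed and example timing below are illustrative values.
SPEED_OF_SOUND_SEAWATER = 1_500.0   # m/s, approximate

def distance_from_echo(round_trip_seconds: float,
                       speed: float = SPEED_OF_SOUND_SEAWATER) -> float:
    """Distance to a reflector, given the time for sound to travel
    out to it and back."""
    return speed * round_trip_seconds / 2

# A whale hearing an echo 0.4 s after its click is ~300 m from the target.
print(distance_from_echo(0.4))      # -> 300.0
```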