Researchers at the Massachusetts Institute of Technology (MIT), including two of Indian origin, have developed a computer interface that can transcribe words the user verbalises internally but does not actually speak aloud. Electrodes in the device pick up neuromuscular signals in the jaw and face that are triggered by internal verbalisations -- saying words 'in your head' -- but are undetectable to the human eye. The system consists of a wearable device and an associated computing system. The signals are fed to a Machine Learning (ML) system that has been trained to correlate particular signals with particular words.
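To make the signal-to-word pipeline concrete, here is a minimal sketch of the general idea: windows of electrode readings are reduced to features and a classifier is trained to map them to a small vocabulary. This is not the researchers' actual model; the electrode count, window length, vocabulary, features and synthetic data below are all illustrative assumptions.

```python
# Hypothetical sketch of classifying subvocalisation-like signals into words.
# Everything here (7 electrodes, 250-sample windows, the toy vocabulary and the
# synthetic data) is an assumption for illustration, not the AlterEgo system.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

VOCAB = ["yes", "no", "up", "down"]   # assumed toy vocabulary
N_ELECTRODES = 7                      # assumed electrode count
WINDOW = 250                          # assumed samples per utterance window

def extract_features(window: np.ndarray) -> np.ndarray:
    """Simple per-electrode statistics over one (electrodes x samples) window."""
    return np.concatenate([
        window.mean(axis=1),                            # mean amplitude
        window.std(axis=1),                             # variability
        np.abs(np.diff(window, axis=1)).mean(axis=1),   # mean absolute change
    ])

# Synthetic stand-in for recorded signals: shape (samples, electrodes, window).
rng = np.random.default_rng(0)
n_per_word = 50
signals, labels = [], []
for word_id, _ in enumerate(VOCAB):
    signals.append(rng.normal(word_id, 1.0, size=(n_per_word, N_ELECTRODES, WINDOW)))
    labels.extend([word_id] * n_per_word)

X = np.array([extract_features(w) for w in np.concatenate(signals)])
y = np.array(labels)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clf = RandomForestClassifier(random_state=0).fit(X_train, y_train)
print("held-out accuracy:", clf.score(X_test, y_test))
print("predicted word:", VOCAB[clf.predict(X_test[:1])[0]])
```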
AlterEgo: Interfacing with devices through silent speech
"The motivation for this was to build an IA device -- an intelligence-augmentation device," said Arnav Kapur, a graduate student at the MIT Media Lab who led the development of the new system. "Our idea was: Could we have a computing platform that's more internal, that melds human and machine in some ways and that feels like an internal extension of our own cognition?" he added. Kapur is the first author on the paper. Pattie Maes, Professor of Media Arts and Sciences is the senior author and he is joined by Shreyas Kapur, an undergraduate major in electrical engineering and computer science.
The device is part of a complete silent-computing system that lets the user undetectably pose and receive answers to difficult computational problems. The idea that internal verbalisations have physical correlates has been around since the 19th century, and it was seriously investigated in the 1950s. One of the goals of the speed-reading movement of the 1960s was to eliminate internal verbalisation, or subvocalisation, as it's known.
According to Kapur, the system's performance should improve with more training data, which could be collected during its ordinary use. "We're in the middle of collecting data, and the results look nice," Kapur said. "I think we'll achieve full conversation some day." The researchers described their device in a paper presented at the Association for Computing Machinery's Intelligent User Interface (IUI) conference.