AMES Scientists Working on Silent Communication Technology

Thursday, March 18th, 2004

NASA scientists have developed a system to read the nerve signals associated with speech. From AMES Scientists Working on Silent Communication Technology:

The National Aeronautics and Space Administration researchers have found that small, button-sized sensors placed under the chin and on either side of the Adam's apple can gather nerve signals that a computer can process and translate into words.

“What is analyzed is silent, or subauditory, speech, such as when a person silently reads or talks to himself. Biological signals arise when reading or speaking to oneself with or without actual lip or facial movement,” NASA scientist Chuck Jorgensen said.

In their first experiment, Jorgensen’s team created special software that recognized six words and 10 digits that the researchers repeated subvocally. Initial results were an average of 92 percent accurate.

I don’t believe the system recognizes words per se, but letters that can spell out words — at least going by this quote:

“We took the alphabet and put it into a matrix, like a calendar. We numbered the columns and rows, and we could identify each letter with a pair of single-digit numbers,” Jorgensen said. “So we silently spelled out NASA and then submitted it to a well-known Web search engine. We electronically numbered the Web pages that came up as search results. We used numbers again to choose Web pages to examine. This proved we could browse the Web without touching a keyboard.”
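The coordinate scheme Jorgensen describes is essentially a Polybius-square-style encoding. Here's a minimal sketch of the idea; the grid width and layout are assumptions, since the article doesn't specify NASA's actual matrix:

```python
import string

# Assumed grid width: 26 letters laid out in rows of 6, like a calendar.
# NASA's real layout isn't given in the article.
GRID_WIDTH = 6

def letter_to_pair(letter):
    """Map a letter to its (row, column) pair of single-digit numbers."""
    index = string.ascii_uppercase.index(letter.upper())
    return divmod(index, GRID_WIDTH)

def pair_to_letter(row, col):
    """Recover a letter from its grid coordinates."""
    return string.ascii_uppercase[row * GRID_WIDTH + col]

# Silently spelling out "NASA" as coordinate pairs, as in the demo:
pairs = [letter_to_pair(c) for c in "NASA"]
word = "".join(pair_to_letter(r, c) for r, c in pairs)
```

With this layout, each subvocalized digit pair picks out one letter, so the recognizer only ever needs to distinguish the ten digits rather than the whole alphabet.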

I love the hypothetical scenarios they have to fabricate to justify their research:

If perfected, the system could allow injured astronauts to control their spacecraft or other machines without using their hands.

Yeah, that’s a common problem.