Rather than being read on a smartphone screen, messages could one day be sent and received through a person’s skin, according to new research.
Engineers at Purdue University, working with MIT and Facebook researchers, have developed a technique that can teach people to interpret nonverbal messages through an arm sleeve that sends haptic signals, such as a buzzing sensation, to the skin.
The study found that by using phonemes, the 39 distinct units of sound that make up spoken English, test participants were able to learn to interpret the buzzing signals efficiently.
“I’m excited about this … imagine a future where you’re able to wear a sleeve that discreetly sends messages to you – through your skin – in times when it may be inconvenient to look at a text message,” Hong Tan, a professor of electrical and computer engineering at Purdue University and lead researcher, said in a statement.
Tan is also the founder and director of Purdue’s Haptic Interface Research Laboratory.
What is haptic communication?
Haptic communication refers to the way people interact via the sense of touch.
While previous research has shown that speech communication through the skin is achievable, training people to interpret messages in English through nonverbal signals has historically been difficult.
But now, the researchers have found success with a system that translates English phonemes into haptic stimulation patterns on the skin.
The study
To conduct the study, the researchers selected 100 common English words, such as “ace,” “key,” “shoe,” “knee,” and “all,” and transcribed them each into the 39 English phonemes.
For example, the phoneme transcription for “ace” would be “AY” and “S.”
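To make that transcription step concrete, here is a minimal Python sketch of a word-to-phoneme lookup. The phoneme labels follow the style used in the article’s example (“AY” and “S” for “ace”); the other transcriptions and the dictionary itself are illustrative stand-ins, not the study’s actual data.

```python
# Hypothetical word-to-phoneme lookup, for illustration only.
WORD_TO_PHONEMES = {
    "ace":  ["AY", "S"],
    "key":  ["K", "IY"],
    "shoe": ["SH", "UW"],
    "knee": ["N", "IY"],
    "all":  ["AO", "L"],
}

def transcribe(word: str) -> list[str]:
    """Return the phoneme sequence for a known word."""
    return WORD_TO_PHONEMES[word.lower()]

if __name__ == "__main__":
    print(transcribe("ace"))  # ['AY', 'S']
```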
Then, the researchers gathered two groups of 12 subjects — one group that was focused strictly on phoneme-based learning, and another that was focused on word-based learning.
The phoneme-based learning group trained for 10 minutes a day over 10 consecutive days, first learning the haptic symbols for individual phonemes and then learning the 100 words presented as sequences of those phonemes.
In contrast, the word-based learning group learned the haptic symbols by studying the full words from day one.
Each participant wore a cuff that encircled the forearm from the wrist to just below the elbow. The device contained 24 actuators that vibrated against the skin, varying in feel and location depending on the phoneme being signaled.
For example, the consonants K, P and T were rendered as stationary sensations on different areas of the arm, while vowels were indicated by stimulation patterns that moved up, down or around the forearm.
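The sketch below shows one hypothetical way such an encoding could be organized in code: consonants as a single stationary vibration at a fixed spot, vowels as a short sequence of actuators that produces apparent motion along the forearm. The actuator indices, durations, and specific phoneme assignments are assumptions for illustration, not the study’s actual scheme.

```python
# Illustrative mapping of phonemes onto a 24-actuator sleeve (assumed layout).
from dataclasses import dataclass

NUM_ACTUATORS = 24  # arranged around the forearm from wrist to elbow

@dataclass
class HapticCue:
    actuators: list[int]   # actuator indices fired in sequence
    duration_ms: int       # total duration of the cue

# Consonants: a single stationary actuator (one index, no movement).
CONSONANT_CUES = {
    "K": HapticCue([3], 150),
    "P": HapticCue([11], 150),
    "T": HapticCue([19], 150),
}

# Vowels: a short actuator sequence, producing a moving sensation.
VOWEL_CUES = {
    "AY": HapticCue([4, 8, 12, 16], 300),   # sweep up the forearm
    "IY": HapticCue([16, 12, 8, 4], 300),   # sweep down the forearm
}

def cues_for_word(phonemes: list[str]) -> list[HapticCue]:
    """Translate a phoneme sequence into the haptic cues to play in order."""
    table = {**CONSONANT_CUES, **VOWEL_CUES}
    return [table[p] for p in phonemes]

# e.g. cues_for_word(["K", "IY"]) -> the cues for the word "key"
```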
The results
The researchers found that the phoneme-based learning method was more successful and provided a more consistent path for users to learn in a short period of time.
“With the phoneme approach, one learns 39 symbols corresponding to the 39 phonemes of English, and can then receive any English word made up of a string of phonemes,” said Tan. “With the word-based approach, the smart participants figured out early on that each word is made up of symbols representing the sounds, and they learnt just as well as the participants with the phoneme approach.”
She explained that the word-based approach took most people longer to figure out, which slowed the learning process overall.
“By then, they were a bit behind, their ability to recognize the phonemes was not solid, and their word recognition performance plateaued because they kept confusing the symbols representing the phonemes,” she said.
With the phoneme approach, the researchers found that at least half of the subjects could perform at 80 percent accuracy, with two of the 12 subjects reaching 90 percent accuracy.
Applications
The researchers predict that communication via the skin will benefit everyone — from the hearing-impaired and visually impaired to people on the go.
“Ultimately, anyone with or without sensory deficits can wear such a sleeve to receive information on the go, especially when reading a message is not safe or convenient due to activities like driving or running,” said Tan.
The yearlong research was funded by Facebook as a means to develop new communication platforms.
“We are collaborating with Facebook through the company’s Sponsored Academic Research Agreement. Facebook is interested in developing new platforms for communication and the haptic research we are doing has been promising,” Tan said in a statement.
Tan said the next steps in the research include connecting the sleeve to an automatic speech recognizer, developing a more lightweight and wearable sleeve, and continuing to train people to learn longer strings of words.