Electronic rings wirelessly connected to an AI system are capable of translating multiple sign languages into text, a new study finds.

“I believe this is an important step toward making sign language translation systems more practical, lightweight, and usable in real-world environments,” says Ki Jun Yu, an associate professor of electrical and electronic engineering at Yonsei University in Korea.

More than 300 different sign languages are used worldwide, and many research projects are developing translation devices for communicating with people who do not know a sign language. However, these projects have faced many setbacks.

For example, some projects used cameras and computer vision algorithms to recognize hand gestures. However, these were typically limited to controlled settings with fixed cameras, and were sensitive to lighting variations and other forms of interference.

Other devices relied on wearable sensors that detected either hand motions or electrical signals linked with muscle activity. However, smart gloves, a common kind of wearable sensor, trapped heat and moisture, making prolonged use uncomfortable, and their fixed sensor placements failed to account for individual variations in hand size, finger length, and joint positions, reducing their accuracy. In addition, wearable sensors often had to be hooked up to computers with wires, hampering hand movements. Even sensors that ultimately transmitted their data wirelessly to an external processor were typically all wired to a single shared transmitter.

Better living through wireless

Now scientists have developed a set of electronic rings that each transmit their motion wirelessly to a processing device. Using rings instead of gloves permitted flexible positioning of sensors to help account for variations in people’s hands. The wireless connections allow unrestricted hand motions.

“Bluetooth Low Energy SoCs [systems on chips] have reached a point where an entire wireless communication stack, power management circuit, and sensing module can fit on a flexible substrate small enough to wear as a ring,” Yu says.

In the new study, the researchers examined how much each finger contributed to hand signs, discovering that seven fingers played major roles. As such, their system only employed seven rings to reduce the amount of hardware needed.

Each ring used accelerometers as inertial sensors. These could detect both stationary postures and hand movements to help capture the full complexity of sign languages, which often involve transitions between static and dynamic components. In addition, the scientists wanted to avoid relying on bioelectric signals, which are highly specific to each person and so require extensive calibration for each user.
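As a rough illustration of how such accelerometer streams might be prepared for recognition, the sketch below slices per-sample readings from seven rings into overlapping fixed-length windows that can capture both static postures and transitions. The function names, window length, and hop size are illustrative assumptions, not details from the study.

```python
# Illustrative sketch: segment 3-axis accelerometer streams from seven
# rings into fixed-length windows for a sign recognizer. Names, window
# size, and hop length are assumptions, not details from the study.

NUM_RINGS = 7   # the study found seven fingers carried most sign information
AXES = 3        # x, y, z acceleration per ring
WINDOW = 50     # samples per window (hypothetical)
HOP = 25        # 50 percent overlap between windows (hypothetical)

def make_windows(stream, window=WINDOW, hop=HOP):
    """Slice a per-sample stream of (NUM_RINGS x AXES) readings into
    overlapping windows, each a candidate input to the recognizer."""
    windows = []
    for start in range(0, len(stream) - window + 1, hop):
        windows.append(stream[start:start + window])
    return windows

# Fake data: 200 time steps, each with 7 rings x 3 axes of readings.
stream = [[[0.0] * AXES for _ in range(NUM_RINGS)] for _ in range(200)]
windows = make_windows(stream)
print(len(windows), len(windows[0]))  # prints "7 50"
```

Overlapping windows matter here because signs mix held postures with motion between them; a hop shorter than the window keeps transitions from being cut in half.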

One challenge in developing these rings was mechanical reliability. Initially, the scientists used straight copper interconnects, which nearly broke under repeated bending, so they switched to interconnects with serpentine patterns that withstand repeated flexing.

One AI to unite the rings

The researchers also developed a deep-learning system to recognize signs from hand movements. It could identify signs not just from the two people whose signing was used to train the system, but also from five people who did not take part in the training phase. This suggests the new system could prove of general use without requiring laborious adaptation for each user.

In experiments with the five people who did not help train the system, the new system could recognize 100 common American Sign Language and 100 common International Sign Language words with 88.3 and 88.5 percent accuracy, respectively. In contrast, most previous attempts at sign language translation systems were limited to vocabularies of fewer than 50 words.
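The accuracy figures above rest on a subject-wise evaluation: entire signers are held out of training, and accuracy is scored only on their data. A minimal sketch of that protocol follows, with all names and data hypothetical.

```python
# Illustrative sketch of the evaluation protocol described above:
# hold out entire signers (not random samples) when splitting data,
# then score accuracy on the held-out signers only. All names and
# data are hypothetical.

def subject_split(samples, test_subjects):
    """Split (subject_id, features, label) records by signer, so the
    test set contains only people the model never saw in training."""
    train = [s for s in samples if s[0] not in test_subjects]
    test = [s for s in samples if s[0] in test_subjects]
    return train, test

def accuracy(predicted, actual):
    """Fraction of predictions that match the true sign labels."""
    correct = sum(p == a for p, a in zip(predicted, actual))
    return correct / len(actual)

samples = [
    ("signer_1", "features_a", "HELLO"),
    ("signer_1", "features_b", "THANKS"),
    ("signer_3", "features_c", "HELLO"),
]
train, test = subject_split(samples, test_subjects={"signer_3"})
print(len(train), len(test))  # prints "2 1"
```

Splitting by signer rather than by random sample is what makes the reported numbers evidence of generalization to new users, since a random split would leak each person's signing style into training.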

“Two hundred words is a meaningful advance over prior wireless systems, but it is still a small fraction of a full sign language lexicon, which can contain thousands of signs,” cautions Dosik Hwang, a professor of electrical and electronic engineering at Yonsei University in Korea. “I want to be careful not to overstate what the current system can do in open-vocabulary, real-world conversation.”

The new system was also capable not just of recognizing isolated words, but of translating entire sentences from continuous signing. The scientists suggest this could help support real-time interpretation.

In the long term, “our goal is to make the system work with everyday devices such as smartphones without requiring specialized external equipment,” Yu says. “The rings could wirelessly transmit sign language signals to a mobile device, where they would be automatically translated and displayed in real time. This would make the technology more portable, accessible, and practical for daily communication.”

However, “the most important caution is this —our system translates hand motion into text,” Hwang says. “It does not yet capture facial grammar, mouthing, body posture, or spatial syntax, all of which are grammatically meaningful in sign languages.” A future challenge lies in incorporating those “into a seamless, low-power architecture that maintains the unobtrusive nature of our current design,” Yu adds.

The scientists next aim to train the system with more people, larger vocabularies, and more signing styles and regional dialects, Yu says. “Given our institutional roots, Korean Sign Language is a natural next step,” he adds.

The researchers also hope to make their rings wearable all day, up from nearly 12 hours, through further miniaturization and power optimization, Yu says. “A key priority is migrating the processing pipeline from external hardware [like a laptop] to on-device edge computing [like a mobile phone]. This transition is essential not only for true mobility but also for ensuring user privacy and reducing latency in natural conversation.”

Hwang and colleagues plan to partner with deaf community organizations to develop their devices: “We believe the technology will be significantly improved both in its functional performance and its social integration by including those who will actually use it,” he says.

Beyond sign language translation, these new rings might find use in other gesture-driven applications, Hwang says. “We see immediate potential for this technology in hand rehabilitation monitoring, fine-motor assessment for neurological conditions, and even immersive virtual reality and augmented reality interfaces,” he explains. “By proving its efficacy in the complex domain of sign language, we have essentially stress-tested the system for a wide array of future biomedical and interactive applications.”

The scientists detailed their findings on 1 May in the journal Science Advances.
