This month, a group of stroke survivors in B.C. will test a new technology designed to aid recovery and ultimately help them regain the use of their limbs and hands.
Participants will wear groundbreaking new “smart gloves” that can track hand and finger movements during rehabilitation exercises overseen by Dr. Janice Eng, a stroke rehabilitation expert and professor of medicine at UBC.
These gloves integrate a sophisticated network of highly sensitive sensor yarns and pressure sensors woven into a comfortable, stretchy fabric, allowing even the smallest hand and finger movements to be tracked, captured and transmitted wirelessly.
“These gloves allow us to monitor a patient’s hand and finger movements without a camera. We can then analyze and fine-tune the exercise program for the best possible results, even remotely,” says Dr. Eng.
Precision in Wearable Devices
UBC Electrical and Computer Engineering professor Dr. Peyman Servati, PhD student Arvin Tashakori, and a team from the startup Texavie created the smart glove for the stroke collaboration. Dr. Servati highlighted several of the innovations outlined in a paper published last week in Nature Machine Intelligence.
“This is the most accurate glove we know of that can track hand and finger movements and grip strength without the need for motion capture cameras. Thanks to the machine learning model we developed, the glove can accurately determine the angles of all finger joints and the wrist. The technology is extremely fast, able to detect small stretches and pressures and predict movement with at least 99% accuracy, matching the performance of expensive motion capture cameras.”
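The core idea behind such a glove is converting raw stretch-sensor readings into joint angles. The sketch below is purely illustrative and is not Texavie's method: it assumes a simple per-joint linear calibration between two reference poses (hand open, fist closed), whereas the system described in the paper uses a learned model over many sensor channels. The function name `calibrate`, the voltages, and the angle range are all hypothetical.

```python
# Illustrative sketch only (not the published model): map a raw
# stretch-sensor reading to an estimated joint angle by linear
# interpolation between two calibration poses.

def calibrate(open_reading: float, closed_reading: float,
              open_angle: float = 0.0, closed_angle: float = 90.0):
    """Return a function that converts raw readings to joint angles."""
    span = closed_reading - open_reading
    def to_angle(reading: float) -> float:
        t = (reading - open_reading) / span   # normalize to 0..1
        t = max(0.0, min(1.0, t))             # clamp out-of-range readings
        return open_angle + t * (closed_angle - open_angle)
    return to_angle

# Hypothetical sensor: reads 0.12 V with the hand open, 0.88 V in a fist
index_joint = calibrate(open_reading=0.12, closed_reading=0.88)
print(round(index_joint(0.50), 1))  # mid-flexion reading → 45.0 degrees
```

A learned model replaces this fixed linear map with one trained against motion-capture ground truth, which is how the reported camera-matching accuracy would be evaluated.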
Unlike other products on the market, the gloves are cordless, comfortable, and can be easily washed once the battery is removed. Dr. Servati and his team have developed an advanced method to manufacture the smart gloves and related clothing locally and at relatively low cost.
Augmented Reality and Robotics
Dr. Servati envisions a smooth transition of the gloves to the consumer market through continuous improvement in collaboration with various industry partners. The team also sees potential applications in virtual and augmented reality, animation and robotics.
“Imagine being able to accurately capture your hand movements and interactions with objects and have them automatically displayed on a screen. The applications are endless: you can enter text without a physical keyboard, control a robot, or translate American Sign Language into speech in real time, making communication easier for people who are deaf or hard of hearing.”