Meta's Smart Glasses to Unlock Facial Expression Detection for People with Disabilities
A new wearable device, Aleye, is being showcased at CES as a game-changer in accessibility. Developed by Hapware, this haptic wristband pairs seamlessly with Meta smart glasses to decode facial expressions and other nonverbal cues of people around the wearer.
The Aleye device uses the computer vision capabilities of Meta's Ray-Ban smart glasses to stream video of conversations to an accompanying app. The app then runs an algorithm that detects facial expressions, including smiles, frowns, and more. Users can choose in the app which expressions to detect, which keeps the set of vibration patterns small enough to learn and tell apart.
The device has been tested with promising results. According to Hapware's CEO, Jack Walters, early testers have learned a handful of patterns within just minutes. This innovation could significantly improve communication for individuals who are blind, low vision, or neurodivergent.
A key feature of Aleye is its haptic feedback system. The wristband features bumps on the underside that vibrate in specific patterns to correspond with facial expressions and gestures. These vibrations provide tactile cues, helping users understand the emotional state of those they interact with.
While the device offers a unique approach to accessibility, it is not without limitations. A spokesperson noted that Meta AI's spoken cues, the alternative to Aleye's haptics, can be distracting and may require constant prompting from the user.
Aleye is now available for pre-order, starting at $359 for the wristband alone or $637 for the wristband with an app subscription. The device aims to empower individuals who struggle with nonverbal communication, opening doors to new connections and relationships.