Much depends on the individual and their medical history, but even with cochlear implants or hearing aids, it takes concentration to decipher speech. Some sounds and words are so similar that it’s extremely difficult to distinguish them. For people who rely on lip-reading, picking up every word is impossible. Only around 40 percent of the sounds in the English language can be seen on the lips of a speaker, according to the US Centers for Disease Control and Prevention, and that’s in ideal conditions.
The prospect of having audible speech transcribed in your field of vision is exciting. It can help people with varying degrees of hearing loss, who may suffer from social isolation as a result, to pick up more of a conversation. The XRAI app also works when watching TV, which can be handy for live content, where subtitles aren’t always great (or at the cinema, where captions are absent).
But there are some major caveats here. The XRAI app runs on an Android smartphone that must be attached via USB-C to the Nreal Air Augmented Reality glasses, which cost $379. Yep, you’ll have a wire running down your body from head to pocket. Aside from the expense, wearing glasses can be uncomfortable if you have cochlear implants or hearing aids. Although relatively lightweight for augmented reality glasses, the Nreal Air are still chunky and heavy compared to regular glasses. I can’t imagine wearing them all day.
Another red flag? One of the main reasons someone with hearing loss might want subtitles like this is for noisy environments like cafés, or for group conversations where there's a lot of cross-talk. Yet Feldman insists we go somewhere quiet for the demo, acknowledging that XRAI Glass doesn't work well with background noise or multiple people speaking.
Then there’s the cost, and I’m not talking about Nreal’s glasses. The XRAI Glass Essentials tier is free and offers unlimited transcription and one-day conversation history, but if you want 10 hours of speaker attribution, 30-day conversation history, and the ability to pin the subtitles and customize the user interface, you need the Premium tier, which is free for one month then jumps to $20 per month. For unlimited speaker attribution, unlimited conversation history, and a “personal AI assistant,” you have to shell out $50 per month for the Ultimate tier. That’s a lot of money.
The idea of subtitles for real life has been around for a while. Google published research on wearable subtitles a couple of years ago and teased the possibilities of real-time translation in augmented reality glasses at its latest I/O developer event. A company video shows AR glasses translating languages in real time and subtitling speech for the deaf. Google tells me it's not ready for prime time, and there are issues with making the experience comfortable for people reading text projected into their field of vision.
Based on my brief demo, XRAI Glass does not solve these issues. Having to wear chunky, expensive glasses and having subtitles float in the center of your vision is not ideal. (You need a paid subscription to pin subtitles in 3D space, but I didn’t get to see this.)
Copyright for syndicated content belongs to the linked source: Wired – https://www.wired.com/story/xrai-glass-caption-ar-glasses-first-look/