I dust off my dormant Pokémon save file and watch as I’m dropped back into the marshes of Pastoria City, stood stock-still in front of block-headed NPCs. It’s been months since I played Shining Pearl on the Nintendo Switch. I never completed the game, and I’ve just decided to pick it up again.
I’m wearing a pair of Envision’s smart glasses. They use artificial intelligence and the power of speech to help visually impaired people perceive their surroundings. While there’s a laundry list of things that they can do, I saw another use for them that wasn’t officially advertised: helping people with low vision play video games by reading menus and dialogue.
In 2013, I lost my central vision to a rare genetic disease called Leber’s Hereditary Optic Neuropathy (or LHON for short), something that made it difficult to recognise faces, read text and distinguish colours. At the time, accessibility in gaming hadn’t developed to any significant level, but it’s come on in leaps and bounds in the last few years.
Text-to-speech, customisable controller buttons, magnification and colour inversion were introduced to the Sony PlayStation 4 and Microsoft Xbox One in 2015. Games are now being recognised for their innovation in accessibility at The Game Awards, and last month, Naughty Dog announced that the Last of Us: Part 1 remake will be receiving 60 new accessibility features, including cinematic audio description – the first for any PlayStation game ever.
But the Nintendo Switch has lagged behind other consoles. It doesn’t have system-level text-to-speech, it only added zoom to the console in 2019, and settings for customisable controls and button remapping for Nintendo’s official controllers were only integrated in 2020. While Mario Kart 8 Deluxe and Nintendo Switch Sports have a couple of accessibility settings, text-heavy games, such as the main-series Pokémon games and Animal Crossing: New Horizons, are almost impossible to play.
That’s where the Envision glasses could potentially make a difference. A startup based in the Netherlands, Envision began life in 2018 as a smartphone app, before being turned into a smart glasses wearable in 2020. They were made commercially available earlier this year and given a big upgrade.
Using artificial intelligence, they promise to make life more accessible for anyone with a visual impairment, helping people recognise text, objects, faces, colours and more with just a glance. But while there are a hundred life-enhancing benefits of Envision’s specs, I wanted to know if these glasses could help me play video games again.
Envision glasses: From £2,091, Letsenvision.com
Making inaccessible games accessible
Over the years, blind gamers have learned to adapt inaccessible games to fit their needs, with many building elaborate set-ups in order to get around the accessibility shortcomings on the Nintendo Switch and its first-party titles. Some hook their console up to their computer using a capture card, get their screen reader to analyse the text using optical character recognition and then listen back as a synthesised voice reads that text aloud – a fairly cumbersome process.
I’ve instead been attaching an iPad to a tablet tripod, positioning it in front of the TV and letting Microsoft’s free Seeing AI app run amok with its short text feature, which basically reads any detected text out loud.
Neither option is massively elegant. With the capture card workaround, you have to continuously hit keyboard shortcuts to recognise text, while the Seeing AI method requires perfect positioning and stops recognising text if you linger on a static screen for too long, forcing me to take the iPad out of the tripod or restart the app.
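For the curious, the capture-card workaround boils down to a simple loop: grab a frame, run optical character recognition on it, and speak the result. Below is a minimal, hypothetical sketch of that loop in Python. The `grab_frame`, `ocr` and `speak` functions are stand-ins I’ve invented for illustration (in practice they might be a capture-card feed, an OCR library and a screen reader’s speech engine); the only logic shown is re-reading the screen on a timer while skipping text that hasn’t changed, so a static dialogue box isn’t announced over and over.

```python
import time
from typing import Callable, Optional


def narrate_screen(grab_frame: Callable[[], object],
                   ocr: Callable[[object], str],
                   speak: Callable[[str], None],
                   poll_seconds: float = 1.0,
                   max_polls: Optional[int] = None) -> None:
    """Poll the screen, OCR each frame, and speak only text that has changed.

    grab_frame, ocr and speak are hypothetical stand-ins for a capture
    source, an OCR engine and a text-to-speech voice respectively.
    """
    last_text = ""
    polls = 0
    while max_polls is None or polls < max_polls:
        frame = grab_frame()
        text = ocr(frame).strip()
        # Only announce when the on-screen text actually changes,
        # so lingering on a static menu doesn't repeat the same line.
        if text and text != last_text:
            speak(text)
            last_text = text
        polls += 1
        time.sleep(poll_seconds)
```

This is a sketch of the idea rather than a working accessibility tool; real set-ups lean on a screen reader’s own OCR and keyboard shortcuts rather than a custom script.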
Gaming is supposed to be easy, after all. We all want to power on our console, pick up some joy-cons, sit back and relax, but that’s more challenging when you have a visual impairment.
When I take the Envision glasses out of their felt box and place the titanium arms over my ears, I’m basically wearing the Google Glass, but without the Google Glass software. Yep, the relic of a gadget, which birthed the word “Glasshole”, is very much still kicking around, but it’s an enterprise gadget now. It’s a device aimed at businesses rather than individuals, and it happens to be the ideal platform for an AI-based text-to-speech and image recognition product thanks to its camera and speaker set-up.
I’m essentially using a hands-free version of the Envision app, except with more features, supported voice commands and tap gestures. The feature set is very similar to that of Envision’s biggest competitor, the OrCam MyEye Pro (£4,218, Orcam.com), which won the CES 2022 product innovation award.
After being guided through a verbal tutorial teaching me the tap and swipe gestures performed on the side of the glasses, I’m able to start recognising the world, and more importantly, Pokémon Shining Pearl.
Swiping forward on the touch panel on the frames a couple of times, I reach the “Instant Text” feature, which reads out every bit of text it finds as I look around. As soon as I double tap to launch Instant Text, an incessant stream of words falls out.
“Grunt, gweh, persistent pest. Grunt, gweh, persistent pest. Grunt, gweh, persistent pest,” it blurts out loud over and over again as I look towards the TV and engage one of the members of Team Galactic. Potentially annoying for some, but honestly, it’s exactly what I’d been looking for.
There’s no pause button, so it just keeps talking, but as I move my character around on the screen, open up menus and battle other trainers, the benefits become increasingly clear. I start building a mind map of where certain menu items are on the screen with the help of the glasses, luxuriating in the fact that I can actually read what moves my Pokémon are trying to learn when they level up, and I don’t have to restart an app because it’s stopped talking.
It reads Shining Pearl route titles when they appear on screen, narrates full segments of battle dialogue without missing a beat, and does it all in perfect sentences, no less. Artificial intelligence can struggle to recognise and read text, with speech coming out garbled or missing words completely, but Envision recognised text extraordinarily well, and it did it fast.
Having an in-built screen reader would be the ideal scenario, but for those who have spent years finding workarounds in order to make games and consoles more accessible, this is far and away the most elegant solution I’ve found for my needs.
Instant text and beyond
Reading text on a TV screen and battling through menus and reams of dialogue is far from the only thing that these smart glasses can do, though. They can read bank notes accurately, identify colours and detect light.
There’s also a scene description mode, which snaps a picture of what I’m looking at to provide a surprisingly detailed outline of my surroundings – “a desktop computer on top of a desk”, it tells me as I pop my head into the office, though I do wish it worked a little bit more like Instant Text and gave a constant narration of what I was looking at.
And when artificial intelligence fails, because sometimes it does, there’s a really nifty video call feature called “Envision Ally”. It lets me connect to a trusted friend or family member, giving them a window into my world and allowing them to see exactly what the Envision glasses see.
The feature is almost a hybrid version of the Be My Eyes smartphone app – which helps connect visually impaired people to sighted individuals when a little assistance is needed – and Aira, a paid subscription service which connects trained agents to visually impaired people when they’re lost or need a bit of visual interpreting. There’s always this uneasy relationship between privacy and autonomy with assistive tech, but it’s the price many of us pay for independence.
I can register familiar faces into the companion app so that when it detects them in my path, it announces their name out loud. Granted, facial recognition tech is always a little creepy, but the feature is admittedly very useful for someone with central vision loss, especially if I’m trying to pick a friend out in a crowd.
The glasses also feature something called object recognition. It provides the wearer with a list of objects that it can detect and recognise – whether it’s a pet, a chair, a bicycle, a car or a traffic light. While I wouldn’t solely rely on it to help me cross roads, the use case is there, and it’s a feature I’ve ended up using more than I’d expected to. It reminds me of Apple’s ultra-wideband AirTag key finder: the glasses make a beeping sound when the object is right in front of you.
I’m also able to pair some wireless earbuds to the glasses so that speech is piped right into my lugholes rather than played out loud. Even at a very low volume, I could still hear the speech clearly, while others around me couldn’t.
Of course, these things aren’t cheap – assistive technology never is, with most gadgets costing thousands of pounds, just take a look at the OrCam MyEye and eSight glasses. But while the Envision glasses are still expensive, they are one of the cheaper bits of equipment I’ve seen.
The glasses with the titanium frames cost £2,091, while the glasses fitted onto designer Smith Optics frames cost £2,337. Envision also promises two years’ worth of updates for free, with any subsequent updates costing £100. Would I buy these if I just wanted to play games on the Nintendo Switch? Perhaps not, but the glasses do more than read text off the TV.
They’re also not the sexiest specs in the world – Google Glass has never been a good-looking product, but they do the job. I’d be more inclined to wear the Smith Optics outdoors than the titanium model, which honestly looks pretty dorky. Props to anyone who can pull them off.
Founder Karthik Kannan does tell me that he sees the software running on other AR smart glasses in the future, because Envision is a platform rather than a dedicated device – whether that’s the forever-rumoured Apple augmented reality glasses or Google Glass’s spiritual successor, which Google has confirmed is in the works (and which will hopefully look like a pair of sunglasses). The current glasses are still a little too cyborg for me to feel comfortable wearing in public, but fitted into a nice pair of frames, consider me sold.
The verdict: Envision glasses
Ultimately, Envision’s glasses are incredibly useful for a whole range of scenarios, yet my favourite feature remains the simplest – instant text. As I walk down the street with the feature toggled on, I read shop signs again, I read door numbers, I read train times on a board, I read street signs without having to get up close. And at home, I’m reading the dialogue in a Pokémon game, and it’s not a stressful experience. All it takes is just a glance.