Earlier this month at CES 2026, Meta announced a new research collaboration with the University of Utah exploring how consumer-grade EMG wrist wearables could support smart home control for people with different levels of hand mobility. Using the Meta Neural Band, the research will examine how the electrical signals generated by muscles at the wrist can be translated into digital input. Importantly, this work is not limited to navigating smart glasses. The stated aim is to explore how custom EMG gestures could be used to control everyday devices such as smart speakers, blinds, locks and thermostats. As part of this work, the University of Utah team is also testing the precision of EMG input by using the wristband to steer the TetraSki, an adaptive ski designed for people with complex physi...
The arrival of a new generation of hands-free smart locks, such as the Aqara U400, is one of those moments where a feature framed as convenience quietly reveals something much more important. Using Ultra Wideband (UWB) and Apple Home Key, the lock can unlock automatically as you approach with an iPhone or Apple Watch. It represents a shift toward Zero-UI interaction: there is no tapping, no Face ID prompt, no voice command, and no app interaction. You simply arrive at your front door, and it opens. For some people, that’s a nice quality-of-life upgrade. For others, it’s a glimpse of what truly accessible smart homes could look like.

Why hands-free automation matters for accessibility

Smart home accessibility is still too often reduced to voice control. Voice assistants have undoubtedly hel...
Meta brought the future to my doorstep

Two weeks ago, something quite extraordinary happened. Meta Reality Labs visited my London home to let me try the new Meta Ray-Ban Display glasses — the first model with a built-in display — bundled with the Meta Neural Band wristband. Officially released in the US on 30 September 2025, and priced from $799, the devices aren’t yet available in the UK, so having them brought directly to me felt like both a privilege and an honour. I live with a muscle-wasting condition called muscular dystrophy, which means I have very limited movement in my fingers, wrists, and arms. Everyday interactions most people take for granted — like pressing a button or tapping a touchscreen — can be physically impossible for me. That’s why innovations like the Neural Band are so f...
Apple has acknowledged a series of bugs affecting Voice Control in macOS 26, the company’s built-in accessibility feature that allows people to fully operate a Mac by voice. Over the past week, users have reported that several key commands are not working as expected, with some issues seriously impacting productivity. After a detailed bug report was submitted, Apple’s accessibility team confirmed it is aware of the problems and has passed them on to engineering for investigation.

The bugs

The issues confirmed by Apple include:

• “Delete that” command malfunctioning — instead of deleting the last dictated phrase, the command only removes the final character.
• Invisible cursor in Mail — while dictating, the text caret disappears halfway through a line, making it impossible to see where edit...
This line from Meta’s announcement of the new Ray-Ban Display glasses with Neural Band really struck me. “Think of the potential impact it could have for people with spinal cord injuries, limb differences, tremors, or other neuromotor conditions.” I haven’t tried them yet — they don’t launch until 30 September, and only in the US and select stores at first — but as someone with very limited upper limb mobility, I can already see why this feels different.

Beyond another wearable

The first generation of Ray-Ban Metas already brought me into the fold. For the first time, I could wear glasses that looked like glasses, but also gave me voice control, hands-free photo and video capture, calls, messaging, and AI assistance. That was a step towards independence. But wrist wearables from Apple, Goo...
Yesterday’s WWDC 2025 keynote brought Apple’s bold new Liquid Glass design, system-wide renaming (e.g., iOS 26, macOS Tahoe 26), and a big push for on-device Apple Intelligence — available now in eight more languages, and open to developers everywhere. We also saw major updates to iPad multitasking, Spotlight, Xcode 26 with foundation LLM support, and even lighter game controls thanks to Liquid Glass’s fresh UI.

✅ Notable accessibility updates

Announced via press release last month, Apple quietly introduced a number of genuinely welcome updates, available now to try in the betas released after the keynote:

Live Captions on Apple Watch: real-time captioning during calls via the Live Listen mic, controlled remotely from the Watch.
Voice Control improvements: developers can now use Voice Control...
As WWDC 2025 approaches, rumours swirl about what Apple may unveil: enhanced AI features, a new naming convention for all the operating systems, and a visual glass-like overhaul, including round Home Screen icons. There’s also speculation about improvements to Apple Intelligence — a brand that, despite the hype, hasn’t exactly set the world on fire. But for disabled people like me, all eyes are on something less flashy but far more consequential: Voice Control.

Voice Control today: clever, capable — and still flawed

Apple’s Voice Control feature, first introduced six years ago in macOS Catalina, allows hands-free control of a Mac, iPhone, or iPad. It was a game-changer for many disabled people, particularly those with motor impairments who rely on dictation and voice navigation. But its limitations are ...
As WWDC 2025 approaches, Apple is widely expected to reveal a significant redesign of watchOS—part of a broader visual refresh inspired by visionOS. This could include rounded icons, translucent UI elements, and a more unified experience across Apple’s platforms. There’s also speculation that Apple will shift its naming scheme to match the calendar year, introducing “watchOS 26” instead of the expected “watchOS 12.” In terms of functionality, reports point to expanded Apple Intelligence features—though a full Siri overhaul may not arrive until later in the year. Battery-conscious AI enhancements, better contextual awareness, and possibly predictive health prompts are anticipated. Hardware-wise, rumours suggest new health tracking capabilities like blood pressure monitoring for Apple Watch ...
Apple’s WWDC 2025 keynote is just days away, and expectations are high. This year’s announcements are rumoured to include a dramatic visual overhaul—complete with rounded icons on the Home screen and a more unified interface across macOS and iOS. Apple may also introduce a new naming scheme for its operating systems, and its AI effort, Apple Intelligence, is expected to see improvements after a lukewarm debut last year. But beyond new names and fresh polish, disabled people are asking the same question they do every June: Will this be the year Apple finally fixes the accessibility gaps we live with every day? Despite its deserved reputation as a leader in accessibility and inclusive design, Apple continues to leave significant issues unresolved. As I’ve written in The Register over the pas...
macOS finally lets you record custom vocabulary in Voice Control – but Apple still has work to do

Did this really just happen? Tucked away in a recent macOS Sequoia update, Apple has quietly rolled out a long-awaited feature: you can now record custom vocabulary in Voice Control on the Mac. This feature first arrived on iOS in version 18.0 last year — now, at last, macOS has caught up. No fanfare, no splashy announcement. As one prominent Apple influencer told me, “It’s quite crazy Apple didn’t tell us.” But for disabled people who rely on voice-first computing, it’s a big deal. So what’s changed, why does it matter — and where does Apple still fall short?

What is custom vocabulary in Voice Control?

Voice Control is Apple’s built-in speech recognition system that lets users navigate and di...