WWDC 2025: Why Apple Watch still falls short on accessibility

Apple’s wearable is sleek and powerful—but for disabled people, it remains stubbornly reliant on touch.

Close-up digital render of an Apple Watch with a dark strap and silver casing, set against a white background with colourful “WWDC25” text to the left.

As WWDC 2025 approaches, Apple is widely expected to reveal a significant redesign of watchOS—part of a broader visual refresh inspired by visionOS. This could include rounded icons, translucent UI elements, and a more unified experience across Apple’s platforms. There’s also speculation that Apple will shift its naming scheme to match the calendar year, introducing “watchOS 26” instead of the expected “watchOS 12.”

In terms of functionality, reports point to expanded Apple Intelligence features—though a full Siri overhaul may not arrive until later in the year. Battery-conscious AI enhancements, better contextual awareness, and possibly predictive health prompts are anticipated. Hardware-wise, rumours suggest new health tracking capabilities like blood pressure monitoring for Apple Watch Series 11, although development delays could push some features beyond 2025.

And yet, for all the talk of form and fitness, there’s still deafening silence around something far more fundamental: accessibility.

Last year, I wrote for The Register about my own experience as a disabled user—someone who has worn an Apple Watch since 2017 but increasingly struggles to use it. The problem is simple but devastating: Siri still isn’t accessible. And when you live with severe upper limb disability, that makes the Watch practically useless.

Siri remains out of reach

Despite advances in on-device processing and machine learning, activating Siri on Apple Watch still requires raising and moving your wrist to wake the screen. For millions of users, that’s effortless. For people like me, it’s impossible.

Apple introduced Double Tap in watchOS 10 to let users perform actions by tapping index finger and thumb together—but even that requires wrist movement and doesn’t support Siri activation. In my Register piece, I called this out for what it is: a gesture-based feature that excludes the very people it’s supposed to empower.

Siri on the Apple Watch is supposed to be a hands-free lifeline. In reality, it’s gated behind physical interactions that many disabled people cannot perform.

AI could change everything—but will Apple act?

The irony is that AI could be the very tool that solves this problem. Smarter wake-word detection and battery-efficient voice activation are already being developed by Apple’s rivals. Google’s Pixel Watch 3, for instance, introduced an AI-driven Loss of Pulse detection feature that monitors continuously in the background, showing that always-on sensing no longer has to drain a wearable’s battery.

An always-listening Siri on the Watch, paired with personalisation through on-device learning, would make a profound difference. It could empower disabled users to access emergency services, log health data, control smart devices, or simply ask a question—without needing to move their wrist or touch the screen.

If Apple Intelligence really is central to WWDC 2025, then it’s time Apple ensured that intelligence is also inclusive.

The Watch is locked by design

Even beyond Siri, Apple Watch lacks the system-wide Voice Control layer found on macOS and iOS. There’s no way to issue voice commands like “scroll down” or “tap reply.” There’s no persistent listening when the Watch is resting on a charger or worn passively. And AssistiveTouch, while useful for some, requires a level of motion many disabled people cannot achieve.

This all contributes to a simple reality: Apple Watch is still a touch-first device, even in scenarios where touch isn’t possible.

A lifesaving tool, made inaccessible

Apple loves to market the Watch as a lifesaving tool. And for many, it is—detecting falls, alerting emergency contacts, even notifying authorities in crisis situations.

But for disabled users like me, that promise rings hollow. Without reliable, accessible ways to control the Watch—especially via voice—we’re locked out of those life-saving features.

In my living room, the Watch is no more useful to me than it would be if I were stranded on a mountain. And that’s not just frustrating. It’s dangerous.

What WWDC 2025 must deliver

If Apple wants to make good on its accessibility commitments, here’s what we need to see next week:

  • Always-listening Siri, without needing wrist movement
  • Voice Control-style navigation across watchOS
  • Accessibility settings that allow users to train subtle movements as activation gestures
  • Consistent support for health features via voice alone

Apple’s accessibility team has achieved remarkable things on iPhone and Mac. But when it comes to the Watch, the gap is still huge—and growing.

Conclusion: Ten years on, we’re still waiting

Ten years after its launch, Apple Watch is more powerful than ever. But for disabled people, it’s still not accessible in the ways that matter most.

WWDC 2025 is Apple’s chance to change that—not with hardware alone, but with software that finally treats hands-free access as essential, not optional.

Because when your hands don’t work the way others’ do, your voice is everything. And if your Watch won’t listen, it might as well not be there at all.

If you’re interested in Apple’s wider accessibility challenges beyond the Watch, you can read our full WWDC 2025 accessibility wishlist here.

Colin Hughes is a former BBC producer who campaigns for greater access and affordability of technology for disabled people
