
Voicing concerns: the silent struggle for Siri accessibility on Apple Watch

The quest for inclusive voice control on the Apple Watch

[Image: an Apple Watch showing Siri on the display]

The Apple Watch, often regarded as a symbol of cutting-edge technology, pledges to bring the future to our wrists. Yet, for those with severe upper limb disabilities, this future comes with a caveat. The challenge? Siri’s accessibility—or the lack thereof.

Apple’s wizardry has made Siri smarter in recent times. With the voice assistant on your wrist, you can do everything from launching apps and replying to texts to setting alarms, reminders and much more. But waking Siri on the Apple Watch is a feat that remains out of reach for users with limited mobility.

Here’s an exploration of this issue and potential solutions.

Waking up Siri: a gesture too far

Despite Apple’s efforts to enhance Siri’s capabilities on the Watch, the brutal truth is that for users like myself, activating Siri on the Apple Watch remains an impossible task. The reliance on wrist movement to wake the screen before issuing a voice command erects an unintended barrier for those with limited mobility.

This lack of access has led me to give up on Apple’s wrist computer. I’ve owned an Apple Watch since the Series 3 in 2017, but because, without voice control, it is little more than a metal-and-glass ornament strapped to my wrist, I decided not to upgrade to the Apple Watch Ultra 2 when it was released.

Double Tap or double trouble?

Last autumn, as part of watchOS 10, Apple brought Double Tap to the latest Apple Watch. The feature lets you tap your index finger and thumb together twice to perform common actions, like answering a call or replying to a message.

There was hope that Double Tap, available on the Apple Watch Series 9 and Apple Watch Ultra 2, could be used for Siri activation, waking the assistant with a double tap of finger and thumb. However, not only does Double Tap not support Siri activation, it also falls short for people with limited mobility, who find it impossible to use because of the same prerequisite: raising the wrist to wake the screen before the feature will respond.

While some may praise Double Tap as an accessibility win, it remains inaccessible to paralysed individuals and those with severe muscle weakness, such as people living with muscular dystrophy or ALS. The need for an alternative way to trigger Siri that doesn’t rely on wrist movement remains.

In my journey with the Apple Watch, documented over the past six years, my attempts to activate Siri have consistently highlighted how little has changed for those with upper limb disabilities. Despite technological advances like the always-on display, on-device processing of Siri requests, and blood oxygen monitoring, the core challenge remains unaddressed, holding back the full potential of the Apple Watch for some in the disability community.

The silent issue

The problem with Siri on the Apple Watch isn’t just its inaccessibility—it’s the lack of conversation about it. Many in the tech media fail to consider in their coverage of the Watch that it is one of the most inaccessible devices Apple produces. The release of the Apple Watch Series 9 and Apple Watch Ultra 2 last autumn, with Siri requests processed on device, has made the need for a more accessible Siri more pressing than ever.

[Image: an Apple Watch on the wrist, with index finger and thumb about to perform the Double Tap gesture]

A Siri that responds to accessible physical prompts

In an ideal scenario, Siri activation on the Apple Watch should transcend physical limitations. A first step could be implementing Double Tap without requiring raised wrist movement, providing a more accessible trigger for Siri activation. However, the ultimate goal, shared by many in the disability community, is an always-listening Siri on the Apple Watch, akin to the functionality on the iPhone and Mac.

Apple could also introduce an accessibility setting that allows users with extremely limited mobility to train the built-in motion sensor that activates Siri when a wrist is raised, so that it recognises even very subtle movements, far smaller than those typically made by Apple Watch users.
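To make the idea concrete, here is a minimal Swift sketch, using the public CoreMotion framework, of what a user-tunable sensitivity setting like this could look like under the hood. The class name, the threshold value, and the hand-off to Siri are purely my own illustrative assumptions; Apple’s actual raise-to-wake detection is private and undoubtedly far more sophisticated.

```swift
import CoreMotion

// Illustrative sketch only: a user-tunable threshold for "subtle" wrist movement,
// standing in for the accessibility setting proposed above.
// This is not Apple's raise-to-wake API, which is private.
final class SubtleMotionDetector {
    private let motionManager = CMMotionManager()

    // Hypothetical sensitivity a user could adjust in an accessibility setting.
    // Lower values mean smaller movements are enough to trigger (measured in g).
    var sensitivity: Double = 0.05

    func start(onSubtleMovement: @escaping () -> Void) {
        guard motionManager.isDeviceMotionAvailable else { return }
        motionManager.deviceMotionUpdateInterval = 1.0 / 50.0
        motionManager.startDeviceMotionUpdates(to: .main) { [weak self] motion, _ in
            guard let self, let acc = motion?.userAcceleration else { return }
            // Magnitude of user-generated acceleration (gravity already removed).
            let magnitude = (acc.x * acc.x + acc.y * acc.y + acc.z * acc.z).squareRoot()
            if magnitude > self.sensitivity {
                // A very small but deliberate movement stands in for a raised wrist;
                // in the imagined setting this is where Siri would be woken.
                onSubtleMovement()
            }
        }
    }

    func stop() {
        motionManager.stopDeviceMotionUpdates()
    }
}
```

The point of the sketch is simply that the sensor data needed for much gentler gestures is already exposed to developers; the missing piece is a system-level setting that lets the wake threshold be tuned, or bypassed, for people who cannot raise their wrist.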

Balancing act: efficiency vs accessibility

While I don’t have inside information on Apple’s design decisions, one reason Siri is not always listening on certain devices, such as the Apple Watch, could be battery optimisation and resource management. Because the Watch has a far smaller battery than other Apple devices, the Siri detector operates only when the built-in motion sensors detect a raised wrist. Constantly listening for a wake word requires ongoing processing power, potentially draining the battery faster. On devices with smaller batteries, or those designed with a focus on energy efficiency, Apple will have prioritised preserving battery life over maintaining an always-listening feature.

Additionally, user privacy could play a role in this decision, as a perpetually active listening mode raises obvious concerns. Striking a balance between functionality, power efficiency, and privacy is likely at the core of Apple’s decisions regarding Siri’s listening capabilities on different devices.

AI: the catalyst for change

Apple CEO Tim Cook has promised to break new ground with artificial intelligence (AI) later this year. Integrating AI advancements could significantly enhance the feasibility of implementing an always-listening Siri on the Apple Watch while addressing concerns related to power consumption and privacy. Advanced AI algorithms can be leveraged to create a more efficient wake word detection system, optimising resource usage. Machine learning models could be trained to recognise the user’s voice patterns locally on the device, minimising the need for constant communication with external servers and reducing data transfer, thus preserving battery life.
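As a rough illustration of what on-device voice recognition looks like in principle, here is a Swift sketch using Apple’s public Speech framework (available on iPhone, though not currently on the Watch) to spot a wake phrase without sending audio to a server. The class, the wake phrase matching, and the hand-off to an assistant are my own assumptions for illustration; this is not how Siri’s wake-word detector works, and a real one would use a dedicated, far more power-efficient detector rather than full speech recognition.

```swift
import Speech
import AVFoundation

// Conceptual sketch of on-device wake-phrase spotting (not Apple's implementation).
// Assumes speech-recognition and microphone permissions have already been granted.
final class WakePhraseListener {
    private let audioEngine = AVAudioEngine()
    private let recognizer = SFSpeechRecognizer(locale: Locale(identifier: "en_GB"))
    private var request: SFSpeechAudioBufferRecognitionRequest?
    private var task: SFSpeechRecognitionTask?

    func start(wakePhrase: String, onWake: @escaping () -> Void) throws {
        guard let recognizer, recognizer.isAvailable else { return }

        let request = SFSpeechAudioBufferRecognitionRequest()
        // Keep all processing on the device so no audio leaves it.
        request.requiresOnDeviceRecognition = recognizer.supportsOnDeviceRecognition
        self.request = request

        // Stream microphone audio into the recognition request.
        let inputNode = audioEngine.inputNode
        let format = inputNode.outputFormat(forBus: 0)
        inputNode.installTap(onBus: 0, bufferSize: 1024, format: format) { buffer, _ in
            request.append(buffer)
        }
        audioEngine.prepare()
        try audioEngine.start()

        task = recognizer.recognitionTask(with: request) { result, _ in
            guard let heard = result?.bestTranscription.formattedString.lowercased() else { return }
            if heard.contains(wakePhrase.lowercased()) {
                onWake()   // e.g. hand off to the assistant
            }
        }
    }

    func stop() {
        audioEngine.stop()
        audioEngine.inputNode.removeTap(onBus: 0)
        task?.cancel()
    }
}
```

Even this naive version keeps the audio processing local; the battery cost comes from running a full recogniser continuously, which is exactly the problem a small, dedicated wake-word model of the kind described above would be designed to solve.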

Additionally, AI could play a pivotal role in enhancing the contextual understanding of voice commands, making Siri more intuitive and responsive. This could involve utilising natural language processing algorithms that learn and adapt to individual users’ preferences and speech patterns over time. This personalised approach not only improves the overall user experience but also enables Siri to function more effectively in a variety of environments and situations.

Furthermore, AI-driven privacy measures can be implemented to ensure that sensitive information is processed locally on the device, with only anonymised and essential data transmitted for server-side tasks. By embedding robust encryption and anonymisation protocols, Apple can maintain its commitment to user privacy while providing an always-listening Siri experience.

In essence, the integration of AI technologies holds the potential to revolutionise Siri on the Apple Watch, making it not only more accessible for individuals with mobility challenges but also more efficient, responsive, and privacy-centric. This approach aligns with Apple’s ethos of seamlessly blending cutting-edge technology with user-centric design.

Access to Siri: a potential lifesaver

Beyond the accessibility challenges, it’s crucial to recognise the transformative potential of the Apple Watch for disabled people in terms of personal safety and health monitoring. The constant presence on the wrist allows for seamless tracking of vital health metrics, enhancing overall well-being. With the release of watchOS 10.2, Siri can now help users access and log their Health app data. For example:

“Siri, what’s my heart rate?”

“Siri, what’s my blood oxygen?”

Almost from the moment the Apple Watch launched, there have been stories of it saving lives: car crashes, sea and mountain rescues, fall detection, and alerts to dangerous heart problems. For those with limited mobility especially, the Apple Watch should serve as a lifeline, providing instant access to emergency services, caregiver contacts, and real-time health data that can be invaluable in critical situations. Without access to Siri, these important features are next to useless.

As someone who lives with a disability that sometimes requires blood oxygen monitoring, it’s great that the Watch has this feature built in, but if I can’t trigger a check with Siri, the feature may as well not be there.

The importance of access to these features transcends convenience. They could quite literally save your life.

I’ve written this piece to amplify the voices of those silently grappling with the Siri accessibility challenge on the Apple Watch. By shedding light on this ongoing battle, I hope to spark conversations, drive awareness, and ultimately advocate for a more inclusive and accessible future for wearable technology.

Smarter wake word detection and on-device voice recognition could pave the way for a future where technology truly knows no bounds—and no wrists need to be twisted.

The pursuit of a Siri trigger that doesn’t rely on wrist acrobatics is not merely a desire—it’s an absolute necessity.

Colin Hughes is a former BBC producer who campaigns for greater access and affordability of technology for disabled people
