
What I want to see from Apple on accessibility in 2025

As Global Accessibility Awareness Day and WWDC 2025 approach, here’s what I hope Apple will fix — and why accessibility must be more than a promise

[Image: A digital graphic featuring an Apple iPhone and a statement about Apple accessibility in 2025 on a black background.]

Each May, Apple marks Global Accessibility Awareness Day (GAAD) with a series of announcements showcasing its commitment to disabled people. A few weeks later, Apple’s Worldwide Developers Conference (WWDC) sets the tone for the company’s next big software and hardware moves. As these moments approach, the conversation around Apple accessibility in 2025 is coming into sharper focus, raising hopes and hard questions within the disabled community. In this post, I want to share my perspective on what Apple must do next to truly move accessibility forward.

For all of Apple’s achievements in this space, real-world experience shows that there’s still a long way to go. Accessibility needs to be about more than features that look good in a press release — it must be about giving disabled people real independence, dignity, and equal access to the technology that shapes our lives.

Fix what’s broken before adding more

First and foremost, Apple needs to prioritise fixing serious, long-standing accessibility problems that undermine disabled users’ trust in its platforms.

Take the Apple Watch. For disabled people who rely on voice commands to interact with their devices, the Watch can be surprisingly difficult to use. Siri won’t respond unless the screen is awake — and waking the screen requires a wrist raise, tap, or button press. For people with limited movement or paralysis, these physical gestures can be impossible. The result is a device that is often inaccessible at the moment it’s most needed.

This isn’t just a minor inconvenience — it’s a matter of personal safety. The Apple Watch contains vital features for emergency communication, fall detection, health tracking, and wellbeing. These are exactly the features that disabled users could benefit from most — yet the way the device is currently designed locks many of us out.

In 2025, Apple should commit to making the Apple Watch more inclusive for severely disabled people — those who can’t perform wrist acrobatics or physically interact with the screen. Siri should be truly hands-free, responsive at all times, and able to activate without unnecessary barriers.

Accessibility isn’t just about adding new features. It’s about ensuring the existing tools work reliably, consistently, and equitably every day. Before unveiling the next wave of technology, Apple must recommit to the basics — and that starts with the people who need these devices the most.

Voice-first computing needs to leap forward

For disabled people like me, who cannot use a keyboard, mouse, or touchscreen, voice-first computing is essential for independence.

Apple’s Voice Control does offer important functionality. Basic tasks such as selecting text, editing documents, navigating apps, and interacting with websites are broadly possible, and for many users this has been transformative.

But real frustrations remain, and they show that Voice Control has not evolved as it should.

Voice Control does not learn from recognition errors. If a word or command is misunderstood once, the same mistake is likely to recur again and again, wasting time and effort.

Support for users with atypical speech is still very limited — particularly on the Mac. While iPhone users can now record the way they pronounce words and phrases to improve recognition, this crucial feature is missing entirely on the Mac, leaving many users excluded.

Even when dictating, Voice Control frequently mistakes dictated words for voice commands, breaking the natural flow of dictation and making accurate communication far harder than it should be.

These are not minor inconveniences. They are fundamental barriers to autonomy, creativity, and work.

In 2025, Voice Control should become a learning, adaptive, AI-powered system: one that improves with use, understands user patterns, and offers reliable, fluid voice interaction for everyone, including those with atypical or impaired speech.
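To make that concrete, here is a minimal Swift sketch of what “learning from corrections” could look like at the API level. It uses Apple’s public Speech framework, which lets developers bias recognition towards known phrases via contextualStrings; Voice Control’s own internals are not public, so the AdaptiveDictation class and its method names are purely illustrative.

```swift
import Speech

// Illustrative only: Voice Control's internals are not public. This
// sketch approximates "learning from corrections" with Apple's Speech
// framework, which can bias recognition towards known phrases.
final class AdaptiveDictation {
    private let recognizer = SFSpeechRecognizer(locale: Locale(identifier: "en_GB"))

    // Words the user has previously corrected; a real system would
    // persist these across sessions.
    private var learnedVocabulary: Set<String> = []

    // Call whenever the user fixes a misrecognised word, so the same
    // mistake becomes less likely next time.
    func recordCorrection(misheard: String, intended: String) {
        learnedVocabulary.insert(intended)
    }

    // Transcribe an audio file, biasing recognition towards the
    // vocabulary this user actually uses.
    func transcribe(_ audioFile: URL,
                    handler: @escaping (SFSpeechRecognitionResult?, Error?) -> Void) {
        let request = SFSpeechURLRecognitionRequest(url: audioFile)
        request.contextualStrings = Array(learnedVocabulary)
        request.requiresOnDeviceRecognition = true // keep speech data on device
        _ = recognizer?.recognitionTask(with: request, resultHandler: handler)
    }
}
```

A genuinely adaptive system would go much further and retrain on the user’s own speech, but even this simple biasing loop would stop the same misrecognition recurring dozens of times a day.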

Even as Apple rolls out Apple Intelligence, newer third-party voice technologies such as Aqua Voice and Voiceitt are already delivering smarter, AI-powered dictation and voice control experiences. They prove that a better, more human approach to voice interaction is possible.

Apple must now step up and transform Voice Control into a truly intelligent, inclusive platform — one that empowers disabled people to work, communicate, and live more independently.

[Image: Apple Watch displaying the ‘Hey Siri’ prompt with Siri’s waveform icon on a black screen.]

Better hardware support for third-party assistive tech

Accessibility today isn’t just about Apple’s own products — it’s about how well they work with third-party assistive technology.

The current state of Bluetooth and external device support often feels like an afterthought. Issues around pairing, reliability, and lack of deep integration prevent disabled people from using the setups they rely on every day.

For example, I’ve personally encountered an ongoing issue involving Ray-Ban Meta smart glasses and the iPhone. After any phone or WhatsApp call, the voice assistant features on the glasses stop working until the glasses are physically handled — something I cannot easily do due to my disability.

It remains unclear whether the fault lies with Meta, Apple, or both; I have reported the issue to both companies. Either way, the result is the same: vital independence features break down.

This reflects a broader problem that others have also begun to highlight. Earlier this year, Meta CEO Mark Zuckerberg publicly called on Apple to open up its ecosystem, arguing that third-party hardware and software developers need better, fairer access to work reliably with iOS. While the debate between tech giants usually focuses on competition, for disabled people, it’s about something far more important — reliability, choice, autonomy, and dignity.

Apple must take real leadership here. It must not just prioritise its own ecosystem, but actively work to ensure that third-party assistive technologies — whether they are glasses, earbuds, watches, or adaptive controllers — integrate seamlessly, predictably, and accessibly across its devices.

Apple accessibility in 2025: A real test of leadership

The importance of improving third-party integration will only grow in the years ahead. New EU interoperability rules under the Digital Markets Act will soon require companies like Apple to open up their ecosystems, including how well external hardware and software interact with their devices. In practice, that will mean easier device pairing, third-party access to iOS notifications, automatic Bluetooth audio switching, and more.

In response to regulatory pressure, Apple has agreed to implement all nine of the EU’s interoperability requirements, with initial changes beginning in iOS 18 and further improvements expected to continue into iOS 19.

However, true leadership is about more than minimum compliance. Apple now has an opportunity to go beyond what is legally required — by proactively building an ecosystem where smart glasses, adaptive devices, earbuds, and assistive technologies work seamlessly and predictably across all of its platforms.

When disabled people can trust that our tools will work, without fragility or gaps, true accessibility becomes possible.

Make Siri truly accessible

Apple Intelligence — Apple’s new personal AI system — has now officially launched, gradually rolling out in iOS 18. But the most important upgrade for Apple users — a more sophisticated, Apple Intelligence–infused Siri — is still to come.

Following growing criticism about Siri’s limitations, Apple executives have acknowledged that Siri has not kept pace with modern expectations. They have promised that a new, fully rebuilt AI-driven Siri will arrive with iOS 19 in 2025, bringing deeper understanding, better context, and more powerful task handling.

For disabled people, this upgrade cannot come fast enough — but accessibility must not be an afterthought.

Today, Siri remains painfully limited compared to what disabled people need. Too often, it fails to understand, struggles to complete tasks, or blocks users with unnecessary authentication hurdles — creating barriers where independence should exist.

One welcome improvement arrived with iOS 17: Apple introduced a feature allowing AirPods to authenticate Siri requests on an iPhone, eliminating the dreaded, disabling reply:

“You’ll need to unlock your iPhone first.”

For severely disabled users like me, and many others with upper limb disabilities, unlocking an iPhone without help simply isn’t possible.

This small change — allowing authentication through wearing AirPods — was something I personally campaigned for over many years. It represents real progress.

But it does not go far enough.

Currently, access to notifications still times out after a period of inactivity. If I have to pause or cannot respond immediately, I lose the ability to hear or interact with notifications via Siri, and can only regain it by unlocking the iPhone again, something I cannot do independently.

Furthermore, for the feature to work at all, a carer still has to unlock my iPhone before putting the AirPods into my ears, making true independence impossible.

What is needed next is the option for disabled users to permanently toggle AirPods authentication on, without timeout, if they choose.

Even better, Apple should allow this setting to be controlled by a simple Siri voice command, giving disabled users full autonomy to manage authentication without needing physical intervention.
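No public API exposes this system setting today, so the Swift sketch below is hypothetical: it uses Apple’s real App Intents framework to show the shape such a voice-controllable toggle could take, but the AccessibilitySettings store and the toggle itself are imagined.

```swift
import AppIntents

// Hypothetical stand-in for a system setting Apple does not currently
// expose to developers.
enum AccessibilitySettings {
    static var airPodsAuthenticationStaysOn = false
}

// A sketch of a Siri-controllable toggle built with Apple's App
// Intents framework.
struct ToggleAirPodsAuthentication: AppIntent {
    static var title: LocalizedStringResource = "Toggle AirPods Authentication"

    @Parameter(title: "Keep authentication on")
    var enabled: Bool

    func perform() async throws -> some IntentResult & ProvidesDialog {
        AccessibilitySettings.airPodsAuthenticationStaysOn = enabled
        return .result(dialog: enabled
            ? "AirPods authentication will stay on."
            : "AirPods authentication is off.")
    }
}

// Registers spoken phrases so the toggle works entirely by voice.
struct AccessibilityShortcuts: AppShortcutsProvider {
    static var appShortcuts: [AppShortcut] {
        AppShortcut(
            intent: ToggleAirPodsAuthentication(),
            phrases: ["Keep AirPods authentication on in \(.applicationName)"],
            shortTitle: "AirPods Authentication",
            systemImageName: "airpods"
        )
    }
}
```

Since Apple controls the operating system, it could wire a command like this directly into Siri and Settings with no app involved; the point is simply that the plumbing for voice-controlled settings already exists.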

Siri’s dictation capabilities also urgently need to become more sophisticated.

Currently, while you can dictate a message, you cannot send it with a simple “Send this” voice command.

Nor is there any way to correct, format, or navigate within dictated text hands-free, making dictation far less capable than it should be.
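To show how thin the missing layer is, here is a small, self-contained Swift sketch of a dictation command interpreter. The phrases and the VoiceAction type are my own invention, not Apple’s; the point is that separating a handful of spoken editing commands from ordinary dictated text is a well-understood problem.

```swift
import Foundation

// Illustrative sketch: the command phrases and types are invented,
// not Apple's. It separates a few spoken editing commands from
// ordinary dictation.
enum VoiceAction {
    case send                              // dispatch the drafted message
    case correct(old: String, new: String) // replace a misrecognised word
    case insert(String)                    // ordinary dictated text
}

func interpret(_ utterance: String) -> VoiceAction {
    let lower = utterance.lowercased().trimmingCharacters(in: .whitespaces)
    if lower == "send this" { return .send }
    if lower == "new paragraph" { return .insert("\n\n") }
    if lower.hasPrefix("correct ") {
        let body = String(lower.dropFirst("correct ".count))
        if let toRange = body.range(of: " to ") {
            return .correct(old: String(body[..<toRange.lowerBound]),
                            new: String(body[toRange.upperBound...]))
        }
    }
    return .insert(utterance) // anything else is ordinary dictation
}

// interpret("Send this")             -> .send
// interpret("correct site to sight") -> .correct(old: "site", new: "sight")
```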

Personally, I have long believed that Siri Dictation should be merged with Voice Control’s dictation system, creating one world-beating, AI-powered dictation platform embedded deeply into iPhone, iPad, Watch, and Mac.

Everyone would benefit — not just disabled people, but anyone who relies on voice for efficiency, productivity, or accessibility.

All of these improvements would help reduce the physical and cognitive load that many disabled people experience — including those with breathing difficulties, speech impairments, learning disabilities, and those who live with severe fatigue.

With the coming AI overhaul, Apple has a golden opportunity to rebuild Siri with accessibility at its heart:

  • More natural conversations
  • Better understanding of context
  • Ability to perform complex sequences of actions without scripting
  • Robust error handling when commands aren’t understood
  • Better understanding of those with atypical speech, building on improvements introduced in iOS 18

True independence for disabled users is within reach — but it will only happen if Apple chooses to prioritise accessibility as the foundation of its next-generation Siri.

For us, Siri isn’t about convenience. It’s about independence.

Accessibility in visionOS and spatial computing

As Apple moves into spatial computing with Vision Pro and future products, accessibility must be baked in from the start — not treated as an afterthought.

To its credit, Apple launched Vision Pro with a solid foundation of accessibility features, including VoiceOver adapted for spatial interfaces, Dwell Control for eye-based interaction, Switch Control, and AssistiveTouch-style options. Several disabled testers, particularly those with visual impairments, have praised these efforts as a strong early step.

However, significant concerns remain. Vision Pro heavily relies on hand gestures, pinching, and eye tracking — all of which may present barriers for users with significant physical disabilities or fatigue-related conditions. While Dwell Control offers an alternative, real-world reports suggest it may not yet be fully sufficient for people with limited head movement or inconsistent control.

Wearing the device for extended periods can also cause physical fatigue, especially for users with muscular disabilities.

Beyond the Vision Pro itself, getting accessibility right in spatial computing is critical because it will shape the future of wearable technology.

Apple is widely rumoured to be working on smart glasses the size of ordinary spectacles, and if made truly accessible, they could be transformative for disabled people.

Lightweight, voice-driven, accessible smart glasses could offer hands-free navigation, communication, and interaction in ways that current devices still struggle to provide.

At WWDC 2025, I would love to see Apple:

  • Introduce even more robust hands-free options for visionOS.
  • Focus on eliminating reliance on complex physical gestures.
  • Set the standard early for how spatial computing can — and must — include disabled users.

Accessibility in spatial computing isn’t just a nice addition. It’s the foundation for ensuring disabled people are part of the future — not left behind by it.

A genuine dialogue with disabled users

Finally, and perhaps most importantly, Apple needs to build deeper, ongoing dialogue with disabled users — not just rely on internal testing or occasional outreach.

The most valuable insights come from real people facing real barriers, day after day. Accessibility must be shaped in partnership with the people who need it most.

This GAAD and WWDC, I want Apple to go beyond glossy videos and slick marketing. I want them to show they are listening — really listening — and that they are willing to act.

Because accessibility isn’t just another feature. It’s the foundation of equal opportunity in a digital world.

I’d love to hear your thoughts. What accessibility improvements would you like to see from Apple in 2025? Please feel free to share your experiences or ideas in the comments below — I’d really like to hear from you.

Colin Hughes is a former BBC producer who campaigns for greater access and affordability of technology for disabled people
