
How I influenced Apple’s Siri updates and what other accessibility features I’m hoping for in 2024

How a simple email to Apple made Siri more accessible, and other accessibility improvements I would like to see


In iOS 17.4, currently in beta testing, there is a new option to set Siri to read incoming messages in a specific language, such as French, Spanish, or German. The change does not alter the primary language that Siri speaks and responds in; it is limited to message playback.

This is something I highlighted to Apple in an email last August, citing an example of how adding other languages to Siri would be helpful. I have a Polish carer and we message in Polish, but all my other contacts message me in English, which was causing issues with Siri and messaging. Obviously, I am delighted my plea hit the right spot at Apple!

Email to Apple, August 2023, highlighting the issue of communicating via iMessage in a language other than English

Siri message playback in a different language is not an accessibility feature per se, but I highlighted the problem to Apple because of an accessibility issue I faced communicating with my carer in the Messages app. I am not surprised that Apple has introduced languages to Siri message playback, as it is helpful to everyone who lives a multi-language life.

This experience has got me thinking about what other accessibility features I would like to see in 2024, which I believe will benefit not only disabled people, but also anyone who uses Apple devices, including the iPhone, Mac and Apple Watch. Join me as I examine potential updates that could make a substantial difference in the accessibility landscape for Apple users.

Automatic message translation with Siri

Whilst reading messages out in a specific language is a useful feature, I would love to see Apple go further this year and use AI to introduce two-way automatic translation with specific contacts when messaging with Siri. For example, my carer is Polish and we communicate in Polish, but if I use Siri to dictate a message to her, I have to dictate in English; when she replies, she replies in Polish. Adding two-way automatic translation on a per-contact basis when messaging with Siri would be amazing, and I think it would be popular with a lot of people living multi-language lives.
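To make the idea concrete, here is a rough Swift sketch of the per-contact logic I have in mind. Every name in it is hypothetical: the `Translator` protocol stands in for whatever on-device translation machinery Apple might use. The point is simply that a single per-contact setting could drive translation in both directions:

```swift
import Foundation

// Hypothetical per-contact preference: the language this contact
// reads and writes in, distinct from my own Siri language.
struct ContactTranslationPreference {
    let contactName: String
    let contactLanguage: Locale.Language   // e.g. Polish for my carer
    let myLanguage: Locale.Language        // e.g. English
}

// Stand-in for whatever on-device translation engine Apple might use.
protocol Translator {
    func translate(_ text: String,
                   from source: Locale.Language,
                   to target: Locale.Language) -> String
}

// One setting drives both directions: outgoing dictation is translated
// into the contact's language, incoming replies back into mine.
func routeMessage(_ text: String,
                  outgoing: Bool,
                  preference: ContactTranslationPreference,
                  translator: Translator) -> String {
    outgoing
        ? translator.translate(text, from: preference.myLanguage, to: preference.contactLanguage)
        : translator.translate(text, from: preference.contactLanguage, to: preference.myLanguage)
}
```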

Accessibility toggle for authentication with AirPods

Matthew Cassinelli's article highlighting the part Colin Hughes played in AirPods authentication for Siri requests (matthewcassinelli.com)

When iOS 17 was released last autumn, I was glad to have contributed to a new feature that allows AirPods users to authenticate Siri requests without unlocking their iPhone. This independence-enhancing feature extends the accessibility and convenience of Siri. Now the assistant doesn't ask you to unlock your phone when your hands are busy, or when you are unable to use your hands as a result of disability.

However, as useful as this new feature is, it times out after a set period, rendering messages inaccessible to those of us who can't pick up and unlock our iPhones. To solve this issue, it would be helpful if Apple introduced an accessibility toggle that overrides the timeout.
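Conceptually the change is tiny. Here is a purely hypothetical sketch of what the override might look like; none of these names are real Apple APIs:

```swift
import Foundation

// Hypothetical sketch of the proposed accessibility override.
// None of these names are real Apple APIs.
struct AirPodsAuthSession {
    let startedAt: Date
    let timeout: TimeInterval                 // today's fixed window
    let accessibilityOverrideEnabled: Bool    // the toggle proposed above

    var isStillValid: Bool {
        // With the override on, the session never expires on its own.
        accessibilityOverrideEnabled
            || Date().timeIntervalSince(startedAt) < timeout
    }
}
```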

Voice authentication

This year I would like to see Apple introduce voice as an alternative method of authentication. As welcome as the Siri authentication feature in iOS 17 is, I need to ask my carer to unlock my iPhone first as she puts my AirPods in my ears for me, and we sometimes forget this crucial step. I can't be spontaneous and just leave home on my own with my AirPods in; I need to ask a carer to perform another step so notifications are read out to me when the iPhone is locked.

It would be immensely beneficial if this obstacle could be addressed through a solution like voice authentication, or another method that enhances the accessibility of the authentication process.

Play command for voice messages in messaging apps

At the moment, if I receive a message in iMessage or WhatsApp that contains a voice message, Siri just says: “John has sent you a voice message”. The message can’t be played without getting your iPhone out and pressing play. It would be so useful if users could just say “Play” and have the voice recording played out through their AirPods.
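As a rough illustration of how such a command could be exposed, Apple's App Intents framework already lets apps register phrases that Siri understands. The intent below is a hypothetical sketch — a messaging app would need to wire it up to its actual voice-message attachments — but the shape of the API is real:

```swift
import AppIntents

// Hypothetical "Play" intent a messaging app could expose to Siri.
struct PlayVoiceMessageIntent: AppIntent {
    static var title: LocalizedStringResource = "Play Voice Message"
    static var description = IntentDescription(
        "Plays the most recent voice message aloud, e.g. through AirPods."
    )

    func perform() async throws -> some IntentResult & ProvidesDialog {
        // A real implementation would fetch the latest voice-message
        // attachment here and hand it to an audio player.
        return .result(dialog: "Playing the voice message.")
    }
}
```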

Access to Siri and Double Tap on Apple Watch

I am concerned that the new Double Tap feature on the Apple Watch Series 9 and Ultra 2 isn’t accessible to everyone. Ironically, the feature started out as an accessibility feature, but I feel it needs more consideration by Apple engineers.

For people who struggle to lift the wrist due to severe upper limb weakness or paralysis, Double Tap is impossible to use: you need to raise your wrist and wake the watch screen before the feature will work. This limitation is reminiscent of the challenges posed by hand gestures in AssistiveTouch on the Watch.

Accessibility and on-device Siri on the Apple Watch: a man in a wheelchair wearing an Apple Watch

Even Siri activation doesn’t offer alternatives; you must raise your wrist to wake the screen before saying “Hey Siri.” This renders the Apple Watch the most inaccessible device in Apple’s lineup. It’s disheartening because the Watch holds immense potential to enhance accessibility, independence, personal safety, and health for severely disabled individuals, given its constant presence on the wrist without the need to be picked up.

I created this video five years ago to illustrate my inability to activate Siri on the Apple Watch due to insufficient muscle power. Raising my arm to wake the screen and get Siri’s attention has consistently proven challenging.

Regrettably, even with the latest Apple Watches, nothing has changed in the past five years. Siri on the Apple Watch remains just as inaccessible to me today as it was half a decade ago.

It’s unfortunate that many people are unaware of this fundamental Siri roadblock on the Apple Watch for individuals facing similar challenges. The issue is not prominently highlighted anywhere, which is particularly disappointing this year, with Siri processing now happening on-device on the Watch.

Some YouTube reviewers have praised Double Tap as a great accessibility feature. However, for paralysed individuals and those with severe muscle weakness, muscular dystrophy, ALS, etc., it is not an accessible option.

Ideally, I would prefer to activate Siri solely with my voice, without the need to lift my wrist. As a first step, Double Tap without the requirement to raise the wrist could serve as an accessible trigger for Siri activation, and would be a positive start. Nonetheless, my ultimate preference, shared by many with severe upper limb disabilities, would be an always-listening Siri, akin to the functionality on the iPhone and Mac, eliminating the need to raise the wrist.

Improved dictation with Voice Control and Siri

In August of last year, I highlighted several issues in an article concerning Apple’s accessibility feature, Voice Control. Built into iOS and macOS, it enables users to dictate, navigate, and control their devices through voice commands. Introduced by Apple in 2019, it filled the gap left by Dragon, a third-party voice dictation product whose Mac version had been discontinued.

A significant source of frustration is how Voice Control manages proper nouns, a concern that also extends to dictation with Siri. Users have proper nouns in their contacts, interact with various companies, and have friends with foreign names. Both Siri and Voice Control struggle with proper nouns and capitalisation, and offer no options to address this.

In Android 14, users can train the Google Assistant to recognise how they say and format a specific contact’s name. A potential solution for Apple devices could involve incorporating a toggle in Contacts, or Voice Control vocabulary settings, when adding a proper noun. This toggle could allow users to specify their preference for capitalising proper nouns consistently, even within a sentence. AI could enhance this feature by identifying patterns where certain words or names are consistently capitalised.
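To illustrate the kind of post-processing I mean, here is a small, entirely hypothetical Swift sketch: a user-maintained list of proper nouns — the sort of thing a Contacts or Voice Control vocabulary toggle could feed — is used to re-capitalise dictated text after the fact:

```swift
import Foundation

// Hypothetical: proper nouns the user has flagged for consistent
// capitalisation, fed from Contacts or Voice Control vocabulary.
let properNouns: Set<String> = ["Colin", "iMessage"]

// Re-capitalise any flagged proper noun in freshly dictated text.
func recapitalise(_ dictated: String, nouns: Set<String>) -> String {
    let lookup = Dictionary(uniqueKeysWithValues: nouns.map { ($0.lowercased(), $0) })
    return dictated
        .split(separator: " ")
        .map { word -> String in
            // Match on the word with trailing punctuation stripped.
            let bare = word.trimmingCharacters(in: .punctuationCharacters)
            guard let canonical = lookup[bare.lowercased()] else { return String(word) }
            return String(word).replacingOccurrences(of: bare, with: canonical)
        }
        .joined(separator: " ")
}

// Prints: "please email Colin about the iMessage issue."
print(recapitalise("please email colin about the imessage issue.", nouns: properNouns))
```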

This brings me to the question of why Apple maintains two voice services/apps: Voice Control and Siri. Given the considerable overlap between the two, it seems logical for the teams responsible for them to collaborate. As an accessibility user, I find more utility in Siri than in the Voice Control app designed specifically for people like me to use. Combining the efforts of both teams could result in a more inclusive and cohesive user experience.

Better microphones and noise cancellation

Since iOS 15 was introduced two years ago, there has been a feature buried in the Control Center that instantly improves the quality of your microphone during calls, whether audio-only or video. It’s called Voice Isolation, and it works on most iPhones, iPads and Macs.
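A small aside for developers: the mode is chosen by the user, but since iOS 15 and macOS 12 an app can at least read which microphone mode is currently active via AVFoundation (the value is only meaningful while audio is being captured):

```swift
import AVFoundation

// Read the user's Control Center microphone-mode choice.
switch AVCaptureDevice.activeMicrophoneMode {
case .voiceIsolation:
    print("Voice Isolation is on: background noise is being filtered out.")
case .wideSpectrum:
    print("Wide Spectrum is on: ambient sound is captured as-is.")
default:
    print("Standard microphone mode.")
}
```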

I think this technology should be extended to dictation, whether you are dictating with Siri or Voice Control on iPhone and Mac. Background noise can have an adverse effect on the accuracy of dictation, and I believe extending the technology to dictation will lead to an improved user experience, particularly in noisy settings. The effect would be magical for those who are currently frustrated by the accuracy of dictation.

The quality of the microphones in devices like the iPhone, iPad and Mac is another area Apple should focus on this year, and there are some grounds for optimism. Reliable Apple analyst Ming-Chi Kuo recently reported that the microphones in the iPhone 16 range will be markedly improved as part of Apple’s plans to integrate AI into Siri more effectively. If this is the year of an improved, AI-powered Siri, it’s important the assistant can hear your prompts wherever you are.

Conclusion

The features and improvements outlined in this article have the potential to significantly enhance accessibility, and user experiences generally, across Apple devices. Collectively, they present an opportunity for Apple to further prioritise and advance accessibility, making a meaningful impact on the lives of users with diverse needs and abilities.

AI is set to play a big part in this endeavour. Last week, Apple boss Tim Cook promised that Apple’s AI announcements are coming “later this year”. I expect AI to be massive for accessibility, and for users who rely on Siri and voice to get things done on Apple devices.

While Apple is a company that listens to its disabled users, and has a strong record in the field of accessibility, individual needs vary greatly. Not all the suggestions outlined in this article will be relevant to every disabled person. With this in mind, what improvements would you like to see on Apple devices in 2024? Let me know in the comments.


Colin Hughes is a former BBC producer who campaigns for greater access and affordability of technology for disabled people
