


macOS 10.16 Wishlist: 3 ways Apple can improve accessibility

From logging in with your face to better microphone tech for more accurate dictation, there is a lot Apple can do to improve accessibility on the Mac in 2020.


Please note: when Apple unveiled the latest version of macOS at WWDC 2020 in June, it bumped the version number up to macOS 11.0.

A couple of weeks ago Apple invited select developers to attend an accessibility webinar ahead of WWDC 2020. The event is said not to presage anything WWDC-related, as the company runs sessions like this as part of its ongoing work around disability.

Apple says the webinar is about how developers can improve their apps by supporting accessibility features. For the first time, developers will be able to interact with company engineers during the online event:

“Apple believes that technology is most powerful when it empowers everyone. Join us for an online event to learn how you can take advantage of the award-winning accessibility features that come standard on Apple devices. You’ll be able to ask questions during and after the sessions, and sign up for individual consultations”, the company said in its invite.


If Apple wants to help developers improve their apps by supporting accessibility features, it could start at home by giving the developers of WhatsApp, Facebook Messenger and other messaging services API access to its Announce Messages with Siri feature. This would extend the accessibility of third-party apps by letting Siri read messages out automatically and offer the option to dictate a reply. It would be very helpful for people who can’t use their hands to interact with their device. The new Google Pixel Buds 2, when paired with some Android phones, offer this through a spoken notifications mode. It would be great to see this kind of functionality come to messaging apps on Mac computers.
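There is no public API for Announce Messages today, but if Apple did open it up, adoption would most likely build on SiriKit’s existing Messaging intents, which already let Siri send messages through third-party apps on iOS. A minimal sketch of what a messaging app’s intent handler looks like under that assumption (the class name and the hand-off to the app’s own service are hypothetical):

```swift
import Intents

// A sketch of a SiriKit Messaging intent handler, as it exists on iOS today.
// If Announce Messages with Siri were opened to third parties, something
// like this is presumably where a dictated reply would arrive.
class SendMessageHandler: NSObject, INSendMessageIntentHandling {

    // Ask Siri to prompt again if no message text was captured.
    func resolveContent(for intent: INSendMessageIntent,
                        with completion: @escaping (INStringResolutionResult) -> Void) {
        if let text = intent.content, !text.isEmpty {
            completion(.success(with: text))
        } else {
            completion(.needsValue())
        }
    }

    // Hand the dictated message off to the app's own messaging service here.
    func handle(intent: INSendMessageIntent,
                completion: @escaping (INSendMessageIntentResponse) -> Void) {
        completion(INSendMessageIntentResponse(code: .success, userActivity: nil))
    }
}
```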

Apple is, admittedly, notorious for not always playing well with developers: it places strict limits on its APIs, and entire categories of accessibility apps have sometimes been wiped out by API changes. That aside, its Worldwide Developers Conference 2020 will be held in June as a “completely new online experience” due to COVID-19. Apple usually unveils updates to its operating systems at WWDC, then releases them to the public in September or October following a series of developer and public betas.

With macOS 10.16 set to be the next major release for Mac computers, now is a good time to take a look at some of the features I’d love to see improve accessibility on the Mac this year.

There are loads of general things I would like to see in macOS 10.16 (for example, better AirPods support, iPhone unlock, and Siri Shortcuts on the Mac would all be good), but many are small tweaks not worth making a song and dance about. Here’s a list of the three biggest and most far-reaching features I hope to see in macOS 10.16 that would really improve accessibility.

Note: I have a severe physical disability with normal speech. Therefore, these are the software features and improvements (and some new hardware) I’d like to see for people who want more hands-free control of their Mac computer.

1) Face ID authentication

This simply must happen! Windows Hello-compatible laptops have had hands-free facial recognition login for several years and, strangely, whilst Apple has brought Face ID to the iPhone and the iPad, it is yet to bring the technology to the Mac. It would be great to think that the company is working on new high-quality cameras that can offer Face ID login authentication on its next MacBook, streamlining the process and allowing us to sit down and get to work right away, hands-free. It would also make authentication much easier, encouraging us to spend via Apple Pay, for example. All this would be a massive boost for people whose physical disability makes it difficult to interact with their Mac with their hands, for example by having to type in passcodes to log in. I’ve seen Windows Hello in action on a Microsoft Surface Pro and it really is a game changer for accessibility.
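The plumbing for this arguably already exists. On Touch Bar MacBooks, apps can request biometric authentication through the LocalAuthentication framework, and a Face ID-equipped Mac would presumably surface through the same API. A minimal sketch, assuming the same policy would cover a future camera-based unlock (the reason string is just an illustration):

```swift
import LocalAuthentication

func authenticateUser() {
    let context = LAContext()
    var error: NSError?

    // Check whether biometrics (Touch ID today, perhaps Face ID one day)
    // are available on this Mac.
    guard context.canEvaluatePolicy(.deviceOwnerAuthenticationWithBiometrics,
                                    error: &error) else {
        print("Biometrics unavailable: \(error?.localizedDescription ?? "unknown")")
        return
    }

    // Prompt for hands-free biometric authentication.
    context.evaluatePolicy(.deviceOwnerAuthenticationWithBiometrics,
                           localizedReason: "Unlock your documents") { success, evalError in
        if success {
            print("Authenticated without touching the keyboard")
        } else {
            print("Authentication failed: \(evalError?.localizedDescription ?? "unknown")")
        }
    }
}
```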

2) Improved Voice Control

Last year at WWDC Apple unveiled Voice Control, its biggest accessibility initiative ever. Voice Control is an accessibility feature for people who can’t use normal methods of text input on Mac computers, iPhones and iPads. It has two main functions: it allows people to dictate emails or messages with their voice, and it lets them navigate their screen with commands such as “open Safari” and “quit Mail”.
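On the developer side, those navigation commands only work as well as the accessibility information an app exposes: Voice Control drives apps through the same macOS accessibility APIs as VoiceOver, so controls need sensible labels or users are forced back onto numbered overlays and grids. A short AppKit sketch (the button and label text are hypothetical):

```swift
import AppKit

// Voice Control (and VoiceOver) read accessibility labels, so a user can
// say "Click Send report" to press this button hands-free.
let sendButton = NSButton(title: "Send", target: nil, action: nil)
sendButton.setAccessibilityLabel("Send report")

// Images and custom views need labels too, otherwise Voice Control users
// have nothing to address them by.
let statusIcon = NSImageView()
statusIcon.setAccessibilityLabel("Sync status: up to date")
```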

Voice Control is great for navigating a Mac computer by voice commands alone, but it really falls down when it comes to accurate dictation, which makes it frustrating and unproductive to use. Apple claims Voice Control allows people to “dictate seamlessly into any text box”. This is far from the case, as the software fails to recognise many of the words that are dictated. It’s a sad indictment that, as a quadriplegic, I do not use it because it constantly fails to convert my spoken words into text accurately.

I have had to resort to running a Windows virtual machine on my Mac and using Nuance’s leading Dragon speech recognition software to get things done. This workaround should not be necessary on a Mac.

Some of Voice Control dictation’s poor performance may be due to the fact that only US English is powered by the Siri speech engine for more advanced speech recognition. Apple hasn’t said when UK English will be added but I am hoping it will be available with the release of macOS 10.16 later this year. It’s a pity it’s taking the company so long to add UK English to Voice Control.

Apple really needs to up its game in macOS 10.16 with a beefed-up speech-to-text engine, one that makes dictation significantly faster and more accurate across a range of accents.
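For context, Apple already exposes its speech engine to developers through the Speech framework on macOS 10.15, and its locale handling illustrates the gap: recognition quality and on-device support vary from locale to locale. A minimal dictation sketch, assuming a recorded audio file and the author’s en-GB locale:

```swift
import Speech

// Minimal dictation sketch using Apple's Speech framework (macOS 10.15+).
// Assumes the user has already granted permission via
// SFSpeechRecognizer.requestAuthorization.
func transcribe(fileURL: URL) {
    // en-GB is an assumption here; accuracy and availability vary by locale,
    // which is exactly the gap described above with Voice Control.
    guard let recognizer = SFSpeechRecognizer(locale: Locale(identifier: "en-GB")),
          recognizer.isAvailable else {
        print("Recogniser unavailable for this locale")
        return
    }

    let request = SFSpeechURLRecognitionRequest(url: fileURL)
    recognizer.recognitionTask(with: request) { result, error in
        if let result = result, result.isFinal {
            print("Transcript: \(result.bestTranscription.formattedString)")
        } else if let error = error {
            print("Recognition failed: \(error.localizedDescription)")
        }
    }
}
```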

For more accurate dictation, Mac computers are also going to need better far-field microphones that pick up voices more reliably. Anyone who has tried dictating with the built-in microphones in a MacBook will know how poor recognition is. I use a high-quality USB table microphone instead, as the built-in microphones on my MacBook cannot be relied upon.

It is disappointing that Apple does not focus more on microphones and the important role they play in improving the accuracy of dictation. It could optimise AirPods for dictation on the Mac, but it would need to provide longer battery life for this to work effectively. In comparison, Nuance spends a lot of time advising on microphones, provides microphone calibration settings in its software, and even rates different microphones for performance. Apple says very little about the role of microphones in accurate dictation.

Apple should also look at integrating Voice Control into more of its hardware and software, such as being able to control the Apple TV and its peripherals by voice in tvOS 14. For people who cannot press buttons or flick switches and rely on voice control, Amazon and its Fire TV range of devices have the lead at the moment. Apple really needs to catch up, for accessibility and strategic business reasons.

3) Smarter Siri

In 2016, with macOS Sierra, Apple’s voice assistant made the leap from the iPhone to the Mac. At first the implementation required a click of the mouse or a press of the keyboard, but MacBook Pros introduced in 2018 or later offer hands-free “Hey Siri” activation. This has been a welcome improvement for people who cannot interact with the MacBook keyboard and trackpad with their hands.

Access to intelligent assistants like Siri is key to mainstream consumer appeal nowadays, helping users multitask and get things done. For example, while you work on an email, you can ask Siri to turn the lights or the heating up without having to stop what you’re doing. And for physically disabled users who have difficulty typing on a keyboard, or using the trackpad, the hands-free voice capabilities of Siri are just so liberating.

However, Siri still needs a lot of improvement. It lags behind Google Assistant and Alexa in its ability to answer general questions and to perform actions with third-party hardware and services. Cloud-based HomeKit skills are a natural next step I would like to see, so that more smart home gadgets work with Siri. Few do at the moment, which means much of the smart home equipment that physically disabled users rely on to get things done at home simply can’t be controlled by Siri.

Better and more accurate control by Siri will require smarter software, such as that demonstrated by Google with its next-generation Assistant on the Pixel 4. Compared with what Google offers, Siri is so slow and inaccurate that most people don’t bother using it.

A continued-conversation feature would also really improve Siri’s accessibility, reducing the need for repeated button presses and constant repetition of the wake word.

With macOS 10.16 I want Apple to recognise Siri’s huge accessibility potential and embed it more deeply into the Mac than ever before. I want an all-new Siri that offers a whole new digital assistant experience on the Mac: one that is faster, smarter, better understands both your words and your intent, and is more proactive about doing things on your behalf. I want Siri to be pervasive across the Mac.

Your macOS 10.16 wishlist

If Apple can bring these accessibility features to macOS 10.16 later this year, it will offer a much-enhanced voice experience and widen access for everyone. Until it does, there are huge gaps in voice control and accessibility features on the Mac, and they really curtail independence.

I’m not pretending any of this stuff is easy, and I am sure some features will need to wait for hardware developments. However, I dream of the day when I can ride up to my MacBook Pro in my wheelchair, the screen unlocks and logs me in automatically as I look at it, and Voice Control takes over, navigating accurately as I request and dictating messages and emails as I speak, without the need to touch the keyboard or trackpad.

I’ll have some more suggestions in the coming months but, for now, I want to hear your suggestions. When it comes to accessibility what do you want to see in macOS 10.16 in 2020?

Colin Hughes is a former BBC producer who campaigns for greater access and affordability of technology for disabled people
