Apple’s trillion-dollar amnesia

It's been a busy time for Apple of late, with new iPhones, Watches, an iPad, a MacBook and operating systems all released. But in its rush to preserve its top spot in the mobile device market, has the trillion-dollar tech giant contracted a dose of amnesia when it comes to accessibility features for its disabled consumers? Colin Hughes investigates.


On 17 May, Global Accessibility Awareness Day, Apple marked the occasion by highlighting its accessibility credentials on its apple.com homepage. Under a heading proudly announcing that “technology is most powerful when it empowers everyone”, it linked to the company’s accessibility site, introduced by a film.

Everyone? Well, not me. I’m not feeling very empowered by Apple’s latest offerings, released a few weeks ago in mid-September, at least when it comes to accessibility improvements for people like me who have physical disabilities.

Let’s get one thing out of the way. Apple, to its credit, does offer a range of accessibility features for those with physical, sight, hearing, and learning impairments. Arguably, these features are better than those offered by its competitors such as Google and Microsoft.

I know little of how well they work for people with sight, hearing, and learning impairments. I’m writing from the perspective of someone with a physical impairment, namely muscular dystrophy, which causes severe muscle weakness in my arms and hands, rendering me quadriplegic and unable to easily access my iPhone screen, MacBook keyboard, or Apple Watch face in many situations.

I share Apple’s stated aspiration to empower everyone, and would love to do much more by voice on the gadgets I currently own, but the tech giant seems to be ignoring the potential of voice commands and control for accessibility purposes.

At the moment, with the iPhone and iPad, Apple provides three main ways to access these iOS devices if you have a physical disability: touch, voice (Siri) and switch control. If, like me, you have little or no functional arm and hand movement, then these are the options, and they haven’t changed much in several years.

Siri can do some useful things, but it is still limited: messaging works, and phone calls partly work (you cannot answer the phone or hang up using your voice). There are no options in Siri to control the Books or Kindle apps, although Siri can partially control third-party apps like Skype and WhatsApp.

Apple doesn’t think of Siri in terms of accessibility. It is too busy pushing Siri as a mainstream feature for everyone. At its flashy product launches and hardware demos the accessibility benefits of speech recognition seem to get tossed aside. Instead, it pitches voice control in terms of arguably gimmicky things, like ordering your coffee from Starbucks while turning off your lights on your way out to work.

In my opinion there should be a section within Settings > General > Accessibility > Interaction on all iOS and macOS devices devoted to people with physical disabilities who need to control their iPhones, iPads, MacBooks and Watches by speech commands.

Switch control is an accessibility feature in iOS that allows access to a lot of things. It requires a specialist external switch and adapter to work, and there are numerous types of switch available (many of which Apple sells), so finding an option that fits is usually possible. I don’t find switch control helpful as I often can’t reach a switch, and it feels like quite old-fashioned technology. There is also the added cost of having to buy the hardware, and sometimes of paying specialist third-party developers to set things up.

It is also possible to use a stylus in your mouth to touch the screen of an iPhone or iPad as you would with a finger, though this requires quite a lot of skill and careful mounting of your device.

Face ID came to the iPhone X in 2017 and uses facial recognition to let you unlock your device, log in and pay for things with your face. To its credit, Apple has included an accessibility option within Face ID, so if, like me, you have to wear a ventilator mask over your face at points during the day, Face ID has a clever way of recognising you with the mask on or off.

macOS has a range of accessibility features built into Mac computers and laptops, including help with dictation, Siri, and keyboard and mouse control, to name but a few.

Against this background I have been taking a look at Apple’s latest accessibility offerings released a few weeks ago.

iOS 12

Whilst Apple’s latest iOS 12 update focuses on making things work rather than adding new features, the Siri Shortcuts app is a standout addition that allows iPhone and Apple Watch users to use Siri to step through multistep routines. Shortcuts is designed to let you create custom commands in Siri that launch apps or combine a number of actions.

Amazon Echo has something similar, called ‘Routines’. Say “Good morning” to Alexa and she can give you your news update, the weather, the state of traffic on your commute and then boil your smart kettle for your morning coffee – all with one easy command. Now, with Shortcuts, Siri does the same.

If you have a physical disability, fatigue can be a problem, so this kind of convenience is very helpful and widens your use of technology.
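Under the bonnet, Shortcuts builds on the same “donation” mechanism app developers already use to expose actions to Siri. As a rough illustration, and not Apple’s own code, here is a minimal Swift sketch of an app donating a voice-triggerable action; the activity type, title and phrase are hypothetical, but NSUserActivity and its iOS 12 prediction properties are real API:

```swift
import Intents
import UIKit

// Donate an in-app action to Siri so a user can trigger it by voice.
// The activity type and phrase below are illustrative, not from a real app.
func donateShortcut(from viewController: UIViewController) {
    let activity = NSUserActivity(activityType: "com.example.checkMedication")
    activity.title = "Check my medication list"
    activity.isEligibleForSearch = true
    activity.isEligibleForPrediction = true              // iOS 12: lets Siri suggest it
    activity.suggestedInvocationPhrase = "Check my meds" // phrase the user can record
    viewController.userActivity = activity               // attaching it performs the donation
    activity.becomeCurrent()
}
```

The plumbing for accessibility-focused voice commands, in other words, already exists; what is missing is anyone, Apple included, wiring it up to the actions disabled users actually need.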

When Siri Shortcuts was announced in June it was expected to give people with impairments a real boost in accessing their Apple devices. However, now it has been released, I can’t find anything specific in terms of accessibility shortcuts and actions within the app. Look in the gallery of pre-installed shortcuts, which Apple provides to get you started, and there is not a single one related to accessibility.

As someone who has great difficulty accessing the iPhone screen with my hands, I would like to ask Siri, via a custom Shortcut, to switch Auto-Answer on and off so incoming calls are automatically answered by my iPhone. After all, what is the point in having this feature if you need to use your hands to switch it on and off in the first place? Auto-Answer can be found in Settings > General > Accessibility > Call Audio Routing > Auto-Answer Calls. I am disappointed this simple action is not yet possible. How could Apple completely ignore those with physical access issues when Shortcuts has the potential to be such a liberator?

As I write this article, Apple has belatedly gone on record as saying it sees “huge accessibility potential” for Shortcuts in iOS 12. In a statement, Senior Director of Global Accessibility Policy & Initiatives Sarah Herrlinger spoke about the accessibility benefits of Shortcuts. She explained that the company is receiving feedback from users on how they’re using Shortcuts to combine multiple tasks into one for accessibility benefit:

“It’s already making a difference — helping people across a wide range of assistive needs simplify everyday tasks like getting to work, coming home, or staying in touch with friends and family.

We’re getting great feedback about how powerful the technology is in streamlining frequent tasks and integrating multiple app functions with just a single voice command or tap,” she said.

Apple really is a bit late to the party with this, but these comments are perhaps grounds for some optimism. Whilst it did nothing to signpost the benefits of Shortcuts to physically disabled people at launch, I don’t doubt they are proving of some benefit. I am one of those people who have been feeding back to Apple in the last few weeks, expressing disappointment about the lack of accessibility-specific features in the Shortcuts app at the launch of iOS 12. If Apple were ahead of the game in this area, there would not be this ridiculous disconnect, and accessibility-related shortcuts would have been available at launch. Instead of the somewhat gimmicky demonstration of Shortcuts at its iOS 12 preview in June, Apple could have shown how the app can transform the life of a disabled user. How inspiring that would have been to the gathered masses who watch these evangelical launch events.

watchOS 5

Last year I reviewed the Apple Watch Series 3 and watchOS 4 and revealed how the need to physically raise or twist one’s wrist to wake the Apple Watch face, in order to activate Siri and get things done, to all intents and purposes cut me off from accessing the Watch.

Somewhat painfully, I have come to realise that fixing an Apple Watch to my wrist is akin to fixing it to an inert slab of meat. It does practically nothing for me because my body cannot initiate sufficient physical actions to stimulate the watch into action.

But it doesn’t need to be like this, because the Apple Watch has so much potential to be of tremendous help to someone in my position. It is simply the case that Apple has provided hardly any accessibility features for physically disabled people on the Watch, apart from a wheelchair workout activity.

Last year I decided to keep the Watch in the hope that Apple would come up with a solution this September. As the year went by, rumours of new AirPods 2 (Apple’s popular wireless Bluetooth earbuds) with a dedicated chip for hands-free Siri activation gave me hope that I would finally be able to wake the watch face with a voice command and take full control of my Apple Watch for the first time. Hands-free Siri activation isn’t a feature of the first-generation AirPods. No more futile attempts to raise my wrist, or tap the screen, to get a reaction from Siri, I thought.

My hopes were raised even further when Apple appeared to tease the feature at its September 12 iPhone launch event.

The opening video used to kick off the event showed a woman wearing AirPods. Stopping in front of a pond, she says “Hey Siri” but, significantly, doesn’t tap either AirPod to activate Apple’s voice assistant.

As the launch event unfolded I thought to myself: is Apple teasing a new pair of AirPods this year with the much-needed hands-free “Hey Siri” feature I have been waiting for? Sadly, a new version of AirPods did not materialise, and watchOS 5 remains as inaccessible to me today as watchOS 4 was last year. I can’t put into words how disappointed and deflated I felt that day. To be teased and eventually let down by Apple CEO Tim Cook’s gushing and gimmicky presentation felt especially cruel.

Having checked out watchOS 5 in some detail, I can report that Apple has come up with no accessibility features in its latest version of watchOS for people with physical disabilities.

In terms of hardware, I haven’t had the opportunity to try the new Apple Watch Series 4, but it does come with fall detection, which, if it detects a hard fall, can help connect you to emergency services. This could be useful for anybody with mobility issues and is to be welcomed. I just wish Apple would take this kind of thinking a lot further in its development of the Watch.

Superman

In trying to assess the efficacy of Apple’s accessibility features for people with physical disabilities, and to explain them to people who don’t have experience of disability, I apply what I call the “Christopher Reeve test”. The story of the Superman actor, paralysed from the neck down in a riding accident, is still well known to many, but, dying in 2004, Reeve missed the era of smartphones and smartwatches. I often ask myself: if he were alive today and I put an Apple Watch on his wrist, what use could Christopher Reeve make of it? Like mine, his arms and hands did not work after his accident, and the answer that comes back to me time and again is: nothing. In designing its devices Apple should set the bar high and apply the Christopher Reeve test.

macOS Mojave

Things don’t look much brighter with the release of macOS Mojave, the new operating system that runs on Mac computers and laptops.

The main way I access macOS on my MacBook Pro is with Dragon for Mac speech recognition software. It helps me write anything from this article to a text message to my mum. This past week I have been crushed by developer Nuance’s decision to discontinue Dragon Professional for Mac.

Nuance’s Dragon software is useful to everyone from lawyers and home users to doctors as a way to turn spoken words into printed text. However, it is much more than a convenience to me. I am completely reliant on voice dictation software for correspondence. I do not have a plan B for writing anything.

Nuance’s announcement that it is discontinuing the Mac version of Dragon has put me and many others in a difficult position. While the software will continue to work, there will be no future updates, meaning I will need to find other ways to get done the everyday tasks most people take for granted.

I have a follower on Twitter with a disability who tweeted this reaction to the news:

Patrick is legally blind, has cerebral palsy (with unaffected speech), and is a speech recognition expert. He said: “I cannot rely on discontinued assistive technology in my job. In the long-term I will need to switch to Windows as my desktop computing platform for work. At best this will affect my productivity in my current job, making me slower, at worst I will have to change jobs. It’s possible Apple will do something to fill the gap left by Nuance. But that’s going to take a long time; possibly years. I’m sure I’m not the only one in this situation. There will be other users who have to switch to using Microsoft Windows and/or switch jobs.”

Neil Judd is a digital inclusion expert at Hands Free Computing, which helps people with assistive technology. He echoed what Patrick said about the importance of this software to productivity and employment: “Voice recognition on the Mac with the dictation app and Dragon software is vital in being able to retain independence and carry on with daily tasks. For some it has meant the difference in keeping their job or not, keeping up with targets and expectations.”

People with dyslexia and blind people are also likely to be affected adversely along with severely physically disabled people like myself.

Other options don’t really cut it for me. It would be one thing if the alternatives for Mac users could match Nuance’s Dragon software; unfortunately, nothing comes close at the moment. Apple’s own voice dictation is not as good because it can’t cope with foreign names and work jargon. You can’t train it to recognise words so that it doesn’t repeat the same recognition mistake, and you can’t edit its vocabulary. So, if there is an error in recognition when dictating, people like me can’t take to the keyboard and simply carry on.

That leaves me with the difficult choice of either making do with an inferior product or dropping my Mac in favour of a Windows computer running Nuance’s Dragon for Windows product. I am a Mac user, wired into the Mac ecosystem with a MacBook Pro, iPhone and Watch and all that close integration between devices, so it is not a straightforward decision to switch to a Windows computer and Dragon for Windows.

The writing was on the wall for me when I stumbled across a YouTube video of a 2016 user group presentation in which Nuance R&D program manager Jeff Leiman rather candidly noted how Apple’s accessibility API restrictions left the company unable to implement some of the features it offers in the Windows version of Dragon. Curiously, shortly after this video received publicity in the technology press, it was removed from YouTube.
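To see why that matters, consider how a Dragon-style tool has to work on macOS: it reaches other applications’ text fields through the system accessibility API. The following is a simplified Swift sketch, not Nuance’s code; it assumes the tool has been granted accessibility permission, and it shows how a host app that doesn’t expose its fields leaves a dictation tool stuck:

```swift
import ApplicationServices

// A dictation tool must use the accessibility API to reach other apps' text fields.
// The user must first grant permission in System Preferences > Security & Privacy.
guard AXIsProcessTrusted() else { fatalError("Accessibility permission not granted") }

let systemWide = AXUIElementCreateSystemWide()
var focusedRef: CFTypeRef?

// Find whichever UI element currently has keyboard focus, in any application.
if AXUIElementCopyAttributeValue(systemWide, kAXFocusedUIElementAttribute as CFString, &focusedRef) == .success,
   let focusedRef = focusedRef {
    let focused = focusedRef as! AXUIElement

    // Read the field's existing text...
    var valueRef: CFTypeRef?
    AXUIElementCopyAttributeValue(focused, kAXValueAttribute as CFString, &valueRef)
    print("Current text: \(valueRef as? String ?? "<not exposed>")")

    // ...and try to write dictated text back. If the app does not expose a
    // settable value attribute, this call fails and "full text control" is impossible.
    AXUIElementSetAttributeValue(focused, kAXValueAttribute as CFString, "Dictated text" as CFTypeRef)
}
```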

Frustratingly, the ability to do speech recognition properly is already here. Apple has made a special point of working it into its product showcases, but the involvement ends there, and when it can’t be used to show off a new product, speech recognition seems to get chucked aside.

If I were Tim Cook, the CEO of Apple, a major player such as Nuance walking away would set alarm bells ringing. I really hope a solution can be found, as this harms lots of folk and puts us in a far worse place. Technology is meant to do the opposite.

Lunis Orcutt is a Nuance certified/licensed reseller and runs the KnowBrainer Speech Recognition Forums, which he calls the world’s most popular speech recognition forum. He said: “Nuance bailed out of supporting Mac because they couldn’t justify the R&D with enough sales”.

“The Mac OS is harder to develop for and only occupies 12% of the market, whereas Windows owns 86%. You might see a lot of Mac computers in movies and TV shows but in reality they are much rarer than you might think. Most businesses and pretty much every part of the government use Windows,” he added.

Peter Hamlin, a rehabilitation engineer whose role is the application of assistive technology and specialist configurations of COTS (Commercial Off-The-Shelf) products to support those with severe disabilities in the health service, told me Apple is notorious for not playing well with other developers because of the strict limits it places on its APIs. He said he has been aware of an entire group of Apple accessibility apps being suddenly “wiped out” by API changes by Apple:

“Rather than having to put up with this nonsense, it is no wonder that many more developers choose to write software for Android and Windows (where both platforms go to great lengths to provide support for legacy apps) than the Apple platform”.

“Rather than stifling development, I think that Apple would be well advised to allow a select number of developers of significant solutions for Apple platforms – including those providing accessibility solutions not supported by Apple – enhanced access to Apple APIs”, he added.

I believe Apple now has a responsibility to develop its own voice recognition software on a par with Dragon, or to allow developers of significant solutions for Apple platforms enhanced access to its APIs.

Given Siri’s proven voice skills, you’d think speech recognition would take centre stage in macOS. If Apple truly believes in productivity, the future of voice control on the Mac probably isn’t using Siri to launch a movie to watch on TV. It’s writing about the film afterwards, with your voice rather than your fingers on a keyboard.

Neil Judd said: “Steps are being made to make devices accessible via voice control. The recent boom in virtual assistants and devices such as Amazon Alexa, Google Home, Siri and Apple HomePod show the demand. However, as great as these devices and functionality are, they are not necessarily aimed at the accessibility market. They are marketed more as a fun entertainment gimmick, whereas for those physically impaired they really are a lifesaver, giving back independence and wellbeing.”

I think it’s pretty obvious that Apple has the technical ability to create its own impressive speech recognition application. It has the massive computational power of the cloud at its disposal, and can crunch and correlate your voice input together with whatever other data Apple knows about you, generating the intelligence that is the heart of Siri. Why it, and the other tech companies, have not done so thus far is a mystery to me.
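Indeed, Apple already hands its recognition engine to developers on iOS through the Speech framework. Here is a minimal sketch of transcribing an audio file with it; note that, at the time of writing, it is iOS-only and offers nothing like Dragon’s trainable vocabulary or correction by voice:

```swift
import Speech

// Transcribe a recorded audio file using Apple's own (Siri-backed) engine.
func transcribe(fileAt url: URL) {
    SFSpeechRecognizer.requestAuthorization { status in
        guard status == .authorized,
              let recognizer = SFSpeechRecognizer(locale: Locale(identifier: "en-GB")) else { return }

        let request = SFSpeechURLRecognitionRequest(url: url)
        _ = recognizer.recognitionTask(with: request) { result, error in
            // Partial results stream in; we only print the final transcription.
            if let result = result, result.isFinal {
                print(result.bestTranscription.formattedString)
            } else if let error = error {
                print("Recognition failed: \(error.localizedDescription)")
            }
        }
    }
}
```

The engine exists, in other words; what nobody has built is the full dictation product around it, on the Mac, that Dragon users are now losing.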

As a direct result of the Nuance decision I have been trying out Dragon NaturallySpeaking 15 on Windows 10 these past few days. It is noticeably a more advanced and accurate speech recognition app than Dragon for Mac, but ultimately it is swings and roundabouts. Dragon for Mac seems to work in more text boxes, but without full text control for voice. Direct dictation in Dragon NaturallySpeaking on Windows 10 is only available in approximately 10% of applications. Non-Dragon-friendly applications typically require opening a dictation box, dictating, and then transferring your dictation to the target application, whether that be Facebook, WhatsApp or Twitter, which is tedious and not at all productive.

Setting aside the developer Nuance for a moment, I think all the big tech companies should come up with a uniform system that allows full text control by voice wherever you have to input text. This is really important stuff for people like me, and I’m sure everyone agrees that communication should be a human right. At the moment it is a real mishmash as to which applications (WhatsApp, Facebook, Twitter) support full text control by voice and which don’t. Quite frankly, I am tired of putting up with this crap.

Lunis Orcutt said: “Technically, Twitter, Facebook, and WhatsApp could’ve made their applications Dragon friendly but didn’t feel the need because they’re not in the speech recognition market.”

“Making Facebook etc. more Dragon friendly is much harder than you might think because these are HTML fields and Nuance chose not to support HTML, to save money. HTML is prettier and companies nearly always go with what looks best rather than what works best, and that will probably never change,” he added.
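“Dragon friendly”, in practice, means a control exposes its text through the platform’s accessibility interface so that an external tool can read it and write corrections back. On macOS the plumbing for a native control is small. Here is a sketch of a hypothetical custom AppKit view opting in; standard AppKit text controls do this automatically, which is precisely what many HTML-rendered fields fail to do:

```swift
import Cocoa

// A custom view that exposes its text to assistive tools (VoiceOver,
// Dragon-style dictation) through the NSAccessibility protocol.
class AccessibleTextView: NSView {
    var text = ""

    override func isAccessibilityElement() -> Bool { return true }
    override func accessibilityRole() -> NSAccessibility.Role? { return .textArea }
    override func accessibilityValue() -> Any? { return text }

    // Accepting writes is what lets a dictation tool correct text in place.
    override func setAccessibilityValue(_ value: Any?) {
        if let newText = value as? String {
            text = newText
            needsDisplay = true
        }
    }
}
```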

In this day and age there are laws in the UK, US, and elsewhere that require businesses to provide ramps and accessible toilets for disabled people, yet for some reason big tech companies seem to get away with shirking their accessibility responsibilities. I’m beginning to think there needs to be a law requiring the likes of Facebook, Twitter, Apple and WhatsApp to provide full text control in their text input boxes so that people can dictate into them efficiently and naturally by voice, if that is their only method of communication. Developers like Nuance can only do so much; it is up to the main players to play ball and do the right thing.

It may be that existing UK equalities legislation covers this issue, though the focus has mistakenly always been on websites. The provision of software, whether locally or online, is a service, so it should be covered. Interestingly, with software moving online to the SaaS model, the website accessibility question becomes more and more relevant. I am not a lawyer, but as the internet does not respect borders, and the sites and apps probably originate on servers in California or Iceland, I wonder whether UK equality laws apply. Perhaps something could be done at EU level as a bloc (though Brexit complicates that now).

Accessibility is often used by big tech companies to burnish their credentials, but dig below the surface, and the marketing hype, and there are gaping holes. Apple, Microsoft, Google et al claim their platforms are accessible, but you can’t control many areas of them by voice. If some of the fault lies with applications like Facebook, WhatsApp and Twitter, they should throw them off their platforms until their apps are fully accessible.

Home control

In terms of controlling a smart home, it is good to see Apple bringing its Home app to macOS Mojave, and making Siri always on and listening on some MacBooks is also to be welcomed. The more devices that have Siri on board and available for voice commands, the better, as far as I’m concerned.

But while Siri has seen some major improvements in iOS 12 in terms of usability and access, when it comes to controlling your smart home Apple and Siri are falling behind Google and Amazon in the smart home race. I have written previously about how I had to build my own smart home with Amazon technology because Apple does not work with enough developers and device manufacturers. There are several devices I would like to control by Siri, but unfortunately they are not yet compatible with HomeKit, the software behind Apple’s Siri-powered smart home platform.
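For the devices that do support HomeKit, the control surface Siri uses is open to any app. As a rough sketch, assuming accessories that are already paired, turning on every lightbulb in the primary home takes little more than this (HomeKit is an iOS-only framework):

```swift
import HomeKit

// Turn on every paired HomeKit lightbulb in the primary home.
class LightsController: NSObject, HMHomeManagerDelegate {
    let manager = HMHomeManager()

    override init() {
        super.init()
        manager.delegate = self   // homes load asynchronously after this
    }

    func homeManagerDidUpdateHomes(_ manager: HMHomeManager) {
        guard let home = manager.primaryHome else { return }
        for accessory in home.accessories {
            for service in accessory.services where service.serviceType == HMServiceTypeLightbulb {
                let power = service.characteristics.first {
                    $0.characteristicType == HMCharacteristicTypePowerState
                }
                power?.writeValue(true) { error in
                    print(error == nil ? "\(accessory.name) on" : "Failed: \(error!.localizedDescription)")
                }
            }
        }
    }
}
```

None of which helps, of course, with the many devices whose manufacturers never adopted HomeKit in the first place.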

As I have discovered over the past year, one of the main stumbling blocks if you are physically disabled and want to turn your home into a smart home to increase your independence is cost. Neil Judd explained: “At the moment, if you want to home automate your house via voice control it all seems very nice and exciting until you start totalling up how much all these devices will cost you. You have the controller device, which may be affordable, such as the phone, Amazon Echo, Apple HomePod or Google Home, but then you must factor in smart light bulbs, thermostats, plugs, blinds, doors, kettles. If you are not careful this can run into thousands of pounds very easily, and who can readily afford that without funding?”

“Perhaps these devices should be made available via funding grants as standard and not seen as a home improvement. That would make a big difference”, he added.

The UK government announced in the Budget last month that it will be bringing in a digital services tax on tech giants from 2020. Instead of a crude revenue-raising tax, I think the big tech companies should be given the option of doing more to make their devices and software more accessible and affordable in exchange for certain tax breaks: more of a carrot-than-stick approach to get them to fall into line.

Lunis Orcutt produces a third-party command utility which, used alongside Dragon, allows you to run your computer completely hands-free. He offers his KnowBrainer command software free to anyone, worldwide, with any physical disability. Perhaps tech giants, awash with cash, could take inspiration from a smaller developer like Lunis.

At the very least, tech giants like Apple could start by enabling severely disabled people to access their consumer devices at a discounted cost, in much the same way as other consumers such as students or charities can.

Government could also ensure all disabled people are given access to individual health budgets to give them the choice of purchasing the technology that best meets their needs, rather than that which is on the NHS approved providers list for medical devices.

Apple Pay

I’m disappointed that Apple didn’t release any new features to make Apple Pay more accessible to people like me this year.

Apple Pay lets you make purchases conveniently and securely in shops, apps, and on the web using Safari. It lets you pay for goods by moving your iPhone over a contactless reader, removing the need to use a physical debit or credit card or enter a PIN.

Does Apple think I don’t want to spend money in convenient ways in shops, restaurants, and online like everyone else? Perhaps I am considered too poor to pick up the tab when I go out for dinner with friends?

There is an accessibility setting for Apple Pay, in Settings > General > Accessibility > Side Button > Use Passcode for Payments. It is helpful for people who can’t double-press the side button. However, in many situations I find it difficult to access the screen to input a passcode. I feel exiled from Apple Pay as a whole. I have it set up on my iPhone X and Apple Watch Series 3 but I can’t make full use of it.

Has Apple offered an alternative option for people like me? No. I don’t have the technological answers, but surely the brains at Apple and the banking world could between them come up with a secure way of confirming a payment other than having to physically press the side button, input a passcode, or raise one’s wrist on an Apple Watch? Biometrics should hold the answer: Face ID, or individual voice profiles for Siri, so that my unique voice, via my AirPods, could verify a purchase. “Hey Siri, please pay”, for example.
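Some of that plumbing already exists. Confirming an action with Face ID or Touch ID alone, with no side-button press and no passcode, is a few lines of Apple’s LocalAuthentication framework. The payment hook below is hypothetical, and marrying this to Apple Pay’s security rules would be Apple’s job, but it shows how little is technically missing:

```swift
import LocalAuthentication

// Confirm a (hypothetical) payment with biometrics alone:
// no side-button press and no passcode entry.
func authorisePayment(completion: @escaping (Bool) -> Void) {
    let context = LAContext()
    var error: NSError?

    guard context.canEvaluatePolicy(.deviceOwnerAuthenticationWithBiometrics, error: &error) else {
        completion(false)   // no Face ID / Touch ID available on this device
        return
    }

    context.evaluatePolicy(.deviceOwnerAuthenticationWithBiometrics,
                           localizedReason: "Confirm your payment") { success, _ in
        completion(success)
    }
}
```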

Banks are already using voice ID for verification purposes. Due to access issues, day-to-day purchases in shops have to be made by my carer using my debit card, with me having to hand over my PIN. There has to be a more accessible way for people like me to make purchases through services like Apple Pay. I don’t think Apple and its banking partners have considered this issue enough. Tap-and-pay is helpful, but the amounts are capped at £30, and anyway I want to use my mobile device as it is more secure.

Conclusion

Apple’s accessibility features, particularly for people with physical disabilities, are the same old, same old. There is nothing really new or revolutionary this year, but that has been the case for many years now. When it comes to improving accessibility for physically disabled users, Apple remains stagnant.

It’s not only me who is frustrated. Visit the Apple forums and you will see a lot of frustration expressed by disabled users at, amongst other things, auto-lock not working under the Guided Access accessibility feature following the upgrade to iOS 12.

Problems like this are not a minor inconvenience; they can really affect people’s independence and ability to live and function as human beings. It is particularly frustrating that Apple does not seem to be listening. You can submit feature requests, and post in its forums, but there is little change as a result, and what changes do come take a very long time.

Apple and all the tech companies need to take a more inclusive approach to the design of their software and hardware. It may even be worth their while trying out features on disabled users first, because on many an occasion I’ve seen features that started out as useful for disabled people gain mainstream popularity and uptake.

At the moment it all feels tokenistic, with no joined-up thinking, and with accessibility features coming as an afterthought. The Shortcuts app is a prime example: it has so much potential but has not been optimised for accessibility and for the disabled users who stand to gain most from it. The priority for Apple appears to be the gimmick it can show off at its flashy annual iPhone launch event, not helping people like me live more useful and productive lives.

I buy Apple products, I spend thousands of pounds on Apple products, and I’m not asking for charity. I’m simply asking tech companies like Apple to show greater awareness, and corporate social responsibility, for the benefit of everyone.

I am not expecting them to become medical device manufacturers but I am encouraging Apple and others to explore, design, and deliver mainstream consumer products in a much more inclusive way that meets the additional needs of disabled people.

For people like me, being able to control my Apple device by voice, effectively, can make or break my day; my life even. Apple just doesn’t seem to get that at the moment.

Accessibility features like voice activation open up a whole new world for physically disabled people. Google has recently published an inspiring video showing what this technology can do to transform lives.

Accessibility is important because it makes available, via voice control, the simple things we all take for granted: making a telephone call, answering one, checking the weather and notifications, or writing a message. When it falls short, it prevents or hinders my ability to keep in touch with family and friends. That is no small matter, and I’m sure everyone can relate to it.

Neil Judd from Hands Free Computing sums up the importance: “Voice control of devices opens up the device to those physically impaired and allows them to interact as an individual on social media, organise their daily routines and gain access to the wealth of knowledge on the internet. How would you feel if you couldn’t post a Facebook update or tweet a response? In this online world, these things are important.”

As Apple’s shares sink on Christmas sales forecasts that have disappointed investors, perhaps it could do with a hand, and a few more sales, from people like me.

Colin Hughes is a former BBC producer who campaigns for greater access and affordability of technology for disabled people
