


Best features from Apple WWDC that will never come

Apple’s Worldwide Developer Conference kicks off in San Jose, California on the 3rd of June. With less than a week to go, Colin Hughes has a wish list of accessibility features he would like to see on Apple’s platforms later this year.

Apple WWDC 2019

Every year Apple uses its Worldwide Developers Conference to showcase its new software and technologies for developers.

The conference opening keynote is where the tech giant’s top executives preview the company’s upcoming operating systems. iOS 13, macOS 10.15, tvOS 13, and watchOS 6 are expected to be unveiled. Apple usually releases developer betas of its new iPhone, iPad, Mac, Apple Watch, and Apple TV software the same day, followed by public beta versions a few weeks later and final releases to consumers in September.

With WWDC 2019 only a few days away, the internet is alight with rumours and predictions of what we’ll see, or what people would like to see, during the keynote. However, amongst all the increasingly fevered speculation about Dark Mode for iOS 13, Safari and Mail upgrades, an undo gesture, a redesigned volume HUD, and more, you will probably not read very much about new accessibility features for physically disabled consumers. Apple’s executives, including CEO Tim Cook, are unlikely to talk about them when they take to the stage either, so here are the ones at the top of my wish list.

As I am severely physically disabled, and cannot use my hands to control my Apple devices in many situations, my wish list focuses on features that will be helpful to people with physical and motor disabilities.

iOS 13

With over a billion active devices worldwide, iOS is by far the most popular thing Apple makes, and one of the most popular consumer operating systems in the world. It is undoubtedly used by a lot of physically disabled people too.

The first thing I would like to see in iOS 13 is a new section within Settings – General – Accessibility – Interaction, mirrored across iOS, watchOS, and macOS, devoted to people with physical disabilities who want to control their iPhone, iPad, MacBook and Watch by voice commands.

For a long time Apple has prioritised Switch Control as its main accessibility offering for people with physical and motor disabilities, with voice control and Siri relegated in importance. I would like Apple to change its focus and put Siri and voice control at the heart of what it offers physically disabled people who can’t use their hands to control their mobile devices. Switch Control is an important accessibility feature for some, but for physically disabled people who have use of their voice, good quality and pervasive voice control is now a much better option given the increasing power and accuracy of voice technology.

I would like to be able to use Siri to hang up a phone call with a simple voice command such as “end call” or “hang up”. At the moment Apple’s voice assistant cannot perform this fundamental task. This has got me into all sorts of bother when calling someone and ending up in their voicemail box because they did not pick up. There is nothing I can do in that situation to end the call, so I have to pad out long voice messages until the recipient’s voicemail box runs out of space and the line goes dead. How crazy is that? I should be able to give Siri an “end call” command, as you can on Alexa smart speakers and even Apple’s own HomePod.
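
The frustrating thing is that the plumbing for ending calls programmatically already exists on iOS. Apple’s CallKit framework lets a third-party VoIP app end one of its own calls in a couple of lines. Here is a minimal sketch, assuming the app is tracking the call’s UUID (the identifier below is a placeholder), of the kind of action Siri would need to perform behind the scenes:

```swift
import CallKit
import Foundation

// A minimal sketch: CallKit already lets a VoIP app end one of its own
// calls programmatically, which suggests the plumbing for an "end call"
// voice command exists. `callUUID` stands in for a real call identifier
// the app would be tracking.
let callController = CXCallController()

func endCall(with callUUID: UUID) {
    let endCallAction = CXEndCallAction(call: callUUID)
    let transaction = CXTransaction(action: endCallAction)

    callController.request(transaction) { error in
        if let error = error {
            print("Failed to end call: \(error.localizedDescription)")
        } else {
            print("Call ended")
        }
    }
}
```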

I would like iOS 13 to make it possible to turn on the auto answer accessibility feature with a voice command, so I can control whether incoming phone calls are answered automatically without needing to pick up the phone and hit the answer button: “Hey Siri, turn on/turn off auto answer”. Soon after I wrote about how iOS 10 lacked an auto answer feature, which made it difficult for me to answer phone calls, Apple introduced auto answer in iOS 11, and I find it useful in conjunction with my AirPods. However, it is beyond ironic that Apple does not provide the option of toggling auto answer on and off via Siri. This would be helpful because there are times, such as when I am in a meeting, when I do not need auto answer to be on. Siri can switch other phone functions on and off with a voice command, such as the torch or Bluetooth, but the feature I rely on most to make my iPhone or Watch accessible isn’t an option at the moment.
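
Apple would not even need new machinery to expose this: the Siri Shortcuts system introduced in iOS 12 already lets apps donate actions a user can trigger with a custom phrase. The sketch below is purely illustrative, since auto answer has no public API today, and the activity type identifier is invented:

```swift
import Foundation
import Intents

// Hypothetical sketch only: there is no public API for Auto-Answer, so the
// activity type below is invented. It shows how the existing iOS 12
// Shortcuts donation mechanism could surface a "toggle auto answer" action.
let activity = NSUserActivity(activityType: "com.example.toggle-auto-answer") // hypothetical
activity.title = "Toggle Auto-Answer"
activity.isEligibleForSearch = true
activity.isEligibleForPrediction = true
activity.suggestedInvocationPhrase = "Turn on auto answer"
activity.becomeCurrent() // donate to the system so Siri can suggest it
```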

I can’t unlock my iPhone to access my messages in many situations, so I would like Siri to read them out to me. It’s possible to use Siri hands-free from the lock screen for reading and managing messages, among many other things, but it comes with compromises. For privacy reasons Apple prevents Siri from reading messages aloud unless notification previews are set to on, which means all my private messages are on display on the lock screen of my iPhone, and that is not something I want. This is because Siri currently works by voice recognition, not individual voice authentication.

I would like to see Apple bring voice authentication to Siri in iOS 13, which could even pave the way for Apple Pay payments by voice authentication, a feature I would find incredibly useful (more on this later in this article). Until this happens, Apple could offer a custom setting in accessibility options for those of us who want the privacy of not displaying our messages in full on the lock screen but still want Siri to read and manage our messages when the iPhone is locked.

Too often Apple sees voice control as a gimmick it can demo at a glitzy product launch, or as a way of offering mainstream consumers some added convenience, like buying a Starbucks coffee on the way to work with a voice command to Siri. For me, voice control should be at the heart of what it offers physically disabled consumers who have the power of speech. What is convenience to many is independence to me.

Google is most definitely heading in the right direction with its next-generation Google Assistant, which it highlighted at its I/O 2019 conference earlier this month.

Running on-device, and coming to new Pixel phones later this year, the next-generation Google Assistant can, Google says, understand and process requests up to ten times faster, making operating your phone, multitasking, and even composing email easier than ever by voice. You can multitask across apps, so creating a calendar invite, finding and sharing a photo with friends, or dictating an email is faster than ever before.

It’s no secret that Siri is way behind other voice assistants like the Google Assistant, and I don’t see Siri getting anywhere near this level of performance and sophistication in iOS 13, but this is what Apple should be aiming for. I can see quite a few people skipping the next iPhone, due in September, and waiting instead for the Google Pixel 4, with the next-generation Google Assistant on board, which is set for release in October. It is something I would certainly consider if there are not radical improvements in what Siri can offer people in my situation.

macOS 10.15

While it promises to be a really big year for macOS, with apps that you see on iPhones coming to the Mac, voice control is still very limited on the platform. For example, Siri on the Mac still cannot set alarms and timers.

High quality and accurate voice dictation on the Mac is vital for me in being able to retain my independence and carry on with daily tasks and correspondence but macOS falls short in helping me correspond with my voice.

The main way I have been accessing applications like Mail, Messages and Safari on my MacBook Pro has been with Dragon for Mac speech recognition software. By converting my spoken words into text on my Mac it helps me write anything from this article to a text message to my dad, or a Facebook update.

Last October was a devastating month when the software developer Nuance discontinued its Dragon Professional for Mac product. It was rumoured at the time that the reason Nuance left the Mac platform was that Apple’s accessibility API restrictions left it unable to implement some of the features it offers in the Windows version of Dragon.

Apple is notorious for not playing well with other developers because of the strict limits it places on its APIs. Entire groups of accessibility apps have been known to be suddenly wiped out by Apple API changes. Rather than stifling development, I think Apple should grant a select number of developers of significant accessibility solutions, ones Apple itself does not provide, enhanced access to its APIs.
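
To give a sense of what is at stake, here is a minimal sketch of the macOS Accessibility API layer that tools like Dragon build on: reading the value of whichever UI element currently has keyboard focus. It only works once the user grants the app Accessibility permission, and restrictions at exactly this layer are what can hobble a third-party dictation product:

```swift
import ApplicationServices

// A minimal sketch of the macOS Accessibility API that third-party
// dictation tools depend on: find the UI element with keyboard focus and
// read its text value. The app must first be granted Accessibility
// permission in System Preferences > Security & Privacy.
let systemWide = AXUIElementCreateSystemWide()

var focused: CFTypeRef?
let result = AXUIElementCopyAttributeValue(
    systemWide, kAXFocusedUIElementAttribute as CFString, &focused)

if result == .success, let element = focused {
    var value: CFTypeRef?
    AXUIElementCopyAttributeValue(
        element as! AXUIElement, kAXValueAttribute as CFString, &value)
    print("Focused element contains: \(value ?? "nothing" as CFTypeRef)")
}
```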

It would be one thing if the other options for Mac users could match Nuance’s Dragon software, but Apple’s own voice dictation is nowhere near as good: it can’t cope with foreign names or specialist jargon, you can’t train it to stop repeating the same recognition mistake, and you can’t edit its vocabulary. So when recognition goes wrong, people like me can’t take to the keyboard and simply carry on. Using it is a frustrating and wholly unproductive experience.
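
Frustratingly, Apple already ships a building block for this on iOS in its Speech framework: recognition requests accept contextualStrings, a list of phrases such as names and jargon for the recogniser to favour. A minimal sketch follows (the audio file path and word list are placeholders, and the app would need speech-recognition authorisation first), showing the kind of trainability the system dictation feature could surface:

```swift
import Speech

// A minimal sketch: the Speech framework already accepts vocabulary hints
// via `contextualStrings`, so the building blocks for trainable dictation
// exist. The file URL and word list below are placeholders, and
// SFSpeechRecognizer.requestAuthorization must have been granted.
let recognizer = SFSpeechRecognizer(locale: Locale(identifier: "en_GB"))
let request = SFSpeechURLRecognitionRequest(
    url: URL(fileURLWithPath: "/path/to/dictation.wav"))

// Names and jargon the stock dictation engine would otherwise mangle.
request.contextualStrings = ["Nuance", "WWDC", "tetraplegia"]

recognizer?.recognitionTask(with: request) { result, error in
    if let result = result, result.isFinal {
        print(result.bestTranscription.formattedString)
    }
}
```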

We can only hope a solution can be found in macOS 10.15, as this sorry situation harms lots of people and puts us in a far worse, less productive place. Technology is meant to do the opposite.

I believe, following the debacle of Nuance quitting the Mac platform, Apple now has a responsibility with the release of macOS 10.15 to significantly improve its own dictation application and make it truly useful, especially for physically disabled consumers who don’t have a plan B when it comes to correspondence.

watchOS 6

Last month I wrote about how the inclusion of voice-activated, hands-free “Hey Siri” in the newly released second-generation AirPods has enabled me to fully access my Apple Watch for the first time. Where previously I had to physically raise my hand and press an AirPod to trigger Siri into action (something I cannot do), now, completely hands-free, I can make phone calls, send messages, play music, and much more on my Apple Watch by simple voice commands.

In practical terms this means I can leave home without my iPhone (I couldn’t pick it up anyway) and, with just the Apple Watch on my wrist and AirPods in my ears, stay connected, which has many benefits, including safety and getting things done.

However, I would like watchOS 6 to include auto answer, so that when someone calls while I am out with just my cellular Apple Watch on my wrist, the call can be automatically answered on my AirPods. At the moment this is not possible, which is very limiting. I would also like to be able to turn auto answer on and off with a Siri voice command.

In general, the accessibility options in the Apple Watch for people with physical disabilities are very limited. There is only one: a workout application for people in manual wheelchairs that they can push with their arms. I am an electric wheelchair user, so this application is of little use to me. I would like Apple to give much more thought in watchOS 6 to what it can offer physically disabled people who wear an Apple Watch. Wearable technology, if it is accessible, has so much potential for people like me in terms of independence, healthcare, and personal safety.

HomeKit

It will be interesting to see if HomeKit and the Siri powered smart home gets any attention at WWDC.

HomeKit is Apple’s software framework that lets users set up their iOS device to configure, communicate with, and control smart home appliances. You can ask Siri to turn off the lights from your iPhone, see who’s at the front door on your iPad, adjust your living room temperature from your Mac, or even tell your HomePod to turn up the music.
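
For developers, the framework side of this is straightforward. Here is a minimal sketch, assuming a HomeKit home is already configured and the app has the HomeKit entitlement: it enumerates lightbulb accessories and switches off their power-state characteristic, which is essentially what “Hey Siri, turn off the lights” does under the hood:

```swift
import HomeKit

// A minimal sketch, assuming a HomeKit home is already set up: find every
// lightbulb's power-state characteristic and switch it off.
class LightsController: NSObject, HMHomeManagerDelegate {
    let homeManager = HMHomeManager()

    override init() {
        super.init()
        homeManager.delegate = self
    }

    // Called once HomeKit has loaded the user's homes.
    func homeManagerDidUpdateHomes(_ manager: HMHomeManager) {
        guard let home = manager.primaryHome else { return }
        for accessory in home.accessories {
            for service in accessory.services
            where service.serviceType == HMServiceTypeLightbulb {
                for characteristic in service.characteristics
                where characteristic.characteristicType == HMCharacteristicTypePowerState {
                    characteristic.writeValue(false) { error in
                        if let error = error {
                            print("Could not switch off \(accessory.name): \(error)")
                        }
                    }
                }
            }
        }
    }
}
```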

The problem is that Amazon Alexa and Google Home can do all these things too, and are compatible with thousands more smart devices in the home, offering consumers much more choice over which brand of smart lock, thermostat, television or smart plug they control by voice.

I have had to build my smart home on Amazon’s platform because so many of my smart devices are not compatible with HomeKit. As a long-standing Apple customer this pains me, and makes me question whether Apple is still interested in competing in this corner of the market.

When an appliance works with HomeKit it works extremely well: it is reliable, fast and securely integrated. While privacy and security in the smart home are very important, I don’t think they should slow the expansion of HomeKit. Sometimes you need to trade a little security for a lot of accessibility, so on my wish list is for Apple to bring more device manufacturers to HomeKit this year.

There are some signs of this happening, with Apple dropping its requirement for companies to include a physical MFi security chip and instead adopting software-based authentication. That means more products can now offer HomeKit compatibility without changing their hardware.

Over the past year, living in a smart home I have built myself and taking part in an NHS trial, I have discovered just how much independence voice-controlled smart home technology can offer severely physically disabled people. It’s just a pity I am having to go to Amazon to make it all happen. Sadly, at the moment Apple is not providing many of the answers when it comes to voice-controlled smart home technology for physically disabled people.

Apple Pay

I would like Apple to announce new features next week to make Apple Pay more accessible. Apple Pay lets you make purchases conveniently and securely in shops, in apps, and on the web using Safari. It lets you pay for goods by moving your iPhone or Apple Watch over a contactless reader, removing the need to use a physical debit or credit card or enter a PIN.

In many situations I find it difficult to access the iPhone screen to input a passcode or double-click the side button. As a result I feel exiled from Apple Pay as a whole: I have it set up on my iPhone X and Apple Watch, but I can’t make full use of it.

Due to access issues, purchases in shops and restaurants have to be made by my carer using my debit card, with me having to hand over my PIN. There has to be a more accessible way for people like me to make payments through services like Apple Pay. I don’t think Apple and its banking partners have considered this issue enough, though recently I was invited by a major UK bank to trial banking through wearable technology, which is welcome.

I don’t have the technological answers, but surely the brains at Apple and the banking world could between them come up with a secure way of confirming a payment on wearable and mobile devices other than physically pressing the side button, inputting a passcode, or raising one’s wrist on an Apple Watch. Individual voice profiles for Siri, for example, could use my unique voice, via my AirPods, to authenticate a purchase with a voice command like “Hey Siri, please pay”. This type of solution could work for me.
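
To be clear about what I am imagining, here is a purely hypothetical sketch. Nothing like this API exists today, and every type and method in it is invented. It simply illustrates the shape a voice-profile confirmation step could take before handing off to Apple Pay for the actual transaction:

```swift
import Foundation

// Purely hypothetical: no such API exists, and every name below is
// invented. It sketches what a voice-authenticated payment confirmation
// could look like if Siri supported individual voice profiles.
protocol VoiceAuthenticator {
    // Match the live utterance against the owner's enrolled voice profile.
    func authenticate(utterance: String, completion: @escaping (Bool) -> Void)
}

func confirmPayment(amount: Decimal, using authenticator: VoiceAuthenticator) {
    authenticator.authenticate(utterance: "Hey Siri, please pay") { isOwner in
        if isOwner {
            print("Voice matched enrolled profile; authorising £\(amount)")
            // Hand off to Apple Pay for the actual transaction here.
        } else {
            print("Voice not recognised; payment refused")
        }
    }
}
```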

There are some grounds for optimism when it comes to Apple Pay becoming more accessible. iOS 12.3 allowed users of the Portland metro system in the USA to pay for public transit just by tapping their iPhone or Watch, with no Face ID or Touch ID authentication required.

Later this week, according to TechCrunch, the New York City subway is adding Apple Pay with Express Transit for select subway lines and buses. Travellers will be able to walk up to a turnstile and simply tap their iPhone or Apple Watch to pay and go.

At the moment an Apple Pay transaction on an iPhone X, XR, or XS requires the user to double-click the side button to bring up the Apple Pay interface, look at the phone and wait for Face ID to authenticate, and then hold the iPhone near the merchant’s contactless reader for the transaction to take place.

With the trial on the New York City subway, all of that hassle goes away. You don’t need to double-click. Just tap the phone to the contactless terminal, and you’re done.

For Apple Watch, the user does not need to double-click the side button to begin the Apple Pay experience either. Again, just hold the watch near the reader — that’s it.

While the New York City subway deployment is still in a trial phase, Apple’s direction of travel with these updates to Apple Pay is positive. But it is clear to me that mainstream consumer convenience, not accessibility, is driving these efforts.

Empowering everyone

Apple does a lot to help users with disabilities related to vision, hearing, physical and motor skills, learning, and literacy with many accessibility features like VoiceOver, Live Listen, and Switch Control on products including the iPhone, iPad, Apple Watch, and HomePod.

The company often boasts that accessibility is at the core of what it does, promoting inclusive design and emphasising technology that works for everyone. Despite these laudable claims, I think the same old problem persists: Apple is too fixed on making existing products and services accessible rather than designing from the outset for all members of society, including people like me.

One in seven people around the world has some form of disability, and I would imagine quite a few have physical and motor disabilities as the result of spinal injury, stroke, or muscle-weakening conditions. This is not a small market, so Apple needs to recognise it, and understand that it has a social responsibility to build accessibility into its products and operating systems from the ground up.

The company also needs to give developers the access and tools to allow them to build great accessibility features into their applications. Without adequate access, as we have seen with Nuance, developers simply walk away from the Apple platform.

Accessibility should be for everyone, and software that can help disabled people should not be relegated to a product launch gimmick. Instead, software and devices should be designed in such a way that users of all abilities can benefit.

The frustrating thing is that the technology is already here; voice recognition in particular has matured. The onus now is on Apple to put the pieces of the jigsaw into place so it can start offering physically disabled people truly holistic voice solutions on all its platforms, whether at home or out and about using mobile devices.

While there are a lot of rumours and speculation as to what we’ll see at WWDC next week, these are the features I really hope will make the cut. What are yours? Let me know in the comments.

Colin Hughes is a former BBC producer who campaigns for greater access and affordability of technology for disabled people
