It’s rare to see people get excited about accessibility, but inside Apple’s Developer Academy in Naples, the sense of enthusiasm is palpable.
Sandwiched between weathered buildings in the suburb of San Giovanni a Teduccio, a neighbourhood nicknamed “the Bronx”, is a fresh cohort of more than 250 future iOS developers, eager to learn and to build apps that make a positive impact on the world.
The decked-out halls in the University of Naples Federico II stand in stark contrast to the surrounding neighbourhood. Like a little Silicon Valley hidden away inside Napoli, you’ll find giant screens hooked up to Apple TVs everywhere you look, making it easy to AirPlay content from iPhones and MacBooks in a snap.
Tall sofas encase round tables in pods, giving groups a sense of privacy while fostering intimacy among the students. Then there’s the deliberate positioning of seats around a central column in a seminar room, arranged in a way that’s supposed to make speakers feel uncomfortable.
In a workshop room, several groups of students stand chatting animatedly around large tables, space grey MacBooks and iPads in hand. They’ve just been set a challenge: dissect each other’s apps and weed out any potential issues they might find when it comes to usability and accessibility.
The school in Southern Italy has been a springboard for up-and-coming developers ever since it was launched in 2016. The first in Europe, its intention is to give the next generation of programmers the tools they need to code, design, pitch and market apps for the App Store.
Now in its seventh year, the students – or learners, as Apple affectionately calls them – are just over a month into their year-long app development course, and they’ve already started learning about the importance of designing in a universally accessible way.
As a term, accessibility can be a little nebulous, but it boils down to making something usable by everyone, even if they’ve got a disability. Whether it’s something physical in the real-world – tactile pavements, talking ATMs, buildings – or increasingly commonly the digital world, like apps, websites, games or software, it’s about making sure everyone can use a product without any barriers or limitations.
Sometimes seen as boring or an afterthought, the “A” word can strike fear into the hearts of even the most established developers. But at the academy, Apple’s students are being taught to put accessibility first, from the moment they put pen to paper and start writing a line of code.
To some it’s just another layer of compliance, but for Apple it’s a core value and a principle the company has been championing for years.
Apple opened its first office of disability in 1985, five years before the Americans with Disabilities Act came into place.
But it really changed the game in 2009 when it ported VoiceOver onto the iPhone 3GS – a screen reader that worked on a touch-screen smartphone. It was revolutionary at the time because it allowed people with low vision to use a phone without a keyboard or dial pad, at a time when many blind people thought that making a touch-screen accessible was frankly impossible.
After the App Store rolled out, Apple also made sure that its burgeoning ecosystem of third-party developers had all the tools they needed to make their apps usable by as many people as possible. Apple’s students are taught how to use these tools right from the beginning of their one-year intensive course, instilling in them an immediate sense of importance around universal design.
Apple’s software development kit, which helps developers program apps in Swift and Objective-C, does as much of the heavy lifting as possible when it comes to implementing accessibility elements into their apps. As an example, it bakes the Accessibility Inspector directly into Xcode, the company’s software suite.
For Apple, that’s where it starts. “In downloading that software developer kit and understanding all of the components that Apple provides for making a great app, accessibility is built-in,” says Sarah Herrlinger, Apple’s global head of accessibility. “When our developer relations team talk to developers, [accessibility] is a big piece of the discussion of how you take an app from good to great,” she tells The Independent.
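To give a sense of what “built-in” means in practice: in SwiftUI, standard controls come with accessibility support by default, and a handful of modifiers let a developer refine what VoiceOver announces. A minimal sketch (the view, image name and button here are invented for illustration):

```swift
import SwiftUI

// A simple view whose elements are described to VoiceOver.
// The content (a sales chart and a refresh button) is hypothetical.
struct SalesView: View {
    var body: some View {
        VStack {
            Image("sales-chart")
                // Without a label, VoiceOver has nothing meaningful to read out.
                .accessibilityLabel("Bar chart of monthly sales")

            Button(action: refresh) {
                Image(systemName: "arrow.clockwise")
            }
            // The label is what VoiceOver announces; the hint explains the result.
            .accessibilityLabel("Refresh")
            .accessibilityHint("Reloads the latest sales figures")
        }
    }

    func refresh() { /* fetch new data */ }
}
```

Because the framework supplies sensible defaults, much of the work is simply checking that labels and hints read well, rather than wiring up a screen reader from scratch.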
Back in the workshop room, students are scrutinising every detail of each other’s apps, interrogating all the accessibility elements: Are images and videos accessible? Do colours properly contrast for those with colour blindness? Why doesn’t an app work correctly with VoiceOver?
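Some of this review can even be automated: since Xcode 15, XCTest UI tests can run an accessibility audit that flags common issues such as missing element labels or insufficient contrast. A hedged sketch of what such a test looks like (the app under test is hypothetical):

```swift
import XCTest

final class AccessibilityAuditTests: XCTestCase {
    func testScreenPassesAccessibilityAudit() throws {
        let app = XCUIApplication()
        app.launch()
        // Audits the current screen for common accessibility issues,
        // e.g. contrast, missing element descriptions, Dynamic Type support.
        // The test fails if any issue is found.
        try app.performAccessibilityAudit()
    }
}
```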
The students in the workshop here are all graduates who have been through the academy previously, but have been invited back for another year to help develop apps for non-governmental organisations. It’s real work, where they’ll help to solve real-world problems.
There are now over a dozen Apple developer academies across the globe. The company opened schools in South Korea and Detroit in 2021. At each hub, accessibility takes centre stage in the first few months of the course.
While not every student at the academy goes on to develop an app, several of the apps built there have focused on making the world more accessible for people with disabilities. Roughly 200 apps have been developed since the academy first launched in Naples in 2016. Today, some of the alumni are demonstrating their apps to The Independent.
There’s an app called Hear Me Well. It uses your iPhone’s microphone to amplify the sounds and voices around you, piping that boosted sound through a pair of headphones or earbuds. It’s potentially useful for those with reduced hearing, whether they’re in a crowded bar or home in front of the TV.
There’s TruSteppy, an app that takes advantage of the iPhone’s TrueDepth front-facing camera to help people with low vision detect obstacles in front of them, vibrating when there’s an obstacle in the way.
Then there’s Dusk. This one’s not an accessibility aid, but an app packed with a series of mini games that are fully accessible for the blind, yet can be played by anyone. Earlier this year, Apple released an open-source accessibility plugin that makes it easier for developers working in Unity to make their games accessible.
These aren’t always features technically intended to be used for accessibility purposes, but the graduates have broken them apart and thought about them in interesting ways. The TrueDepth camera wasn’t intended to be turned outward and used to detect obstacles, for example, but the developer has repurposed the camera’s ability to build depth maps and capture infrared images to do just that.
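On the iPhone, depth maps from the TrueDepth camera are exposed through AVFoundation, which is presumably the kind of plumbing an obstacle-detection app like TruSteppy builds on. A rough sketch of receiving depth frames (error handling and the actual obstacle logic are omitted, and this is an assumption about the approach rather than the app’s real implementation):

```swift
import AVFoundation

final class DepthCapture: NSObject, AVCaptureDepthDataOutputDelegate {
    let session = AVCaptureSession()
    let depthOutput = AVCaptureDepthDataOutput()

    func start() {
        // The TrueDepth camera sits on the front of the phone, so a
        // navigation aid would have the user hold the device facing outward.
        guard let device = AVCaptureDevice.default(.builtInTrueDepthCamera,
                                                   for: .video,
                                                   position: .front),
              let input = try? AVCaptureDeviceInput(device: device),
              session.canAddInput(input),
              session.canAddOutput(depthOutput) else { return }
        session.addInput(input)
        session.addOutput(depthOutput)
        depthOutput.setDelegate(self, callbackQueue: DispatchQueue(label: "depth"))
        session.startRunning()
    }

    // Called for each depth frame; an obstacle detector would inspect the
    // depth map here and trigger haptics when something is too close.
    func depthDataOutput(_ output: AVCaptureDepthDataOutput,
                         didOutput depthData: AVDepthData,
                         timestamp: CMTime,
                         connection: AVCaptureConnection) {
        // depthData.depthDataMap is a CVPixelBuffer of per-pixel distances.
    }
}
```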
Apple introduced Live Listen in 2018, which does essentially the same thing as Hear Me Well, and added a “conversation boost” accessibility feature to the AirPods in 2021, using their beam-forming microphones to amplify the voice of the person in front of you. But Hear Me Well works with any headphones, and can run an audiometric test inside the app.
An iOS feature might not have been intended to have an accessibility component attached to it when the feature was first cooked up, but those boundaries are being pushed by Apple’s grads, and accessibility is being thought about in innovative ways.
VoiceOver and beyond
The integration of VoiceOver in the iPhone in 2009 was just the start. Apple’s list of accessibility features is constantly growing. It’s easy to spend weeks scrolling through the accessibility settings, playing with all the features, discovering all the ways you can use its suite of products, from an iPhone to an Apple Watch, and still only ever scratch the surface.
There are features for people with physical and motor difficulties. Switch Control and Voice Control help people use their iPhone with a separate peripheral, or by simply talking out loud. Another feature alerts hard-of-hearing users through their iPhone if it recognises the sound of a fire alarm, running water or a baby crying, for instance.
Some accessibility features have also been adopted by non-disabled people. Back in the pre-iPhone X days, when iPhones had a home button, it was common for users to turn on AssistiveTouch if their home button broke, because it allowed them to jump back to the home screen without having to click the physical button.
In iOS 14, Apple introduced another physical accommodation accessibility feature called Back Tap. Intended to help people with motor difficulties, it was quickly embraced by non-disabled individuals, making it easy to quickly launch complicated shortcuts, open apps or turn on specific settings.
Apple has had a hidden magnifier tool inside the Settings app for some time, but in 2021 it rolled out a dedicated Magnifier app, installed by default on all iPhones.
When the iPhone 12 Pro launched with a LiDAR sensor in 2020, the company brought out a new feature called People Detection. It uses the People Occlusion feature in Apple’s ARKit in combination with AI, VoiceOver and the iPhone’s LiDAR sensor to identify the distance between the user and the person in front, simultaneously offering up a description of what that person looks like.
This year, the company released a Door Detection feature. It helps users gauge the distance to a door, telling them whether it’s open or closed, and whether it opens with a push, a pull or a twist of the knob. It has also recently launched a live captions feature. Still technically in beta, it transcribes spoken and on-device audio in real time. Plus, there are new features for controlling your Apple Watch using gestures or remotely via a paired iPhone.
When Apple launched the People Detection feature, it also created APIs for developers so that they could make use of the feature inside their own apps, taking advantage of the new LiDAR functionality on the iPhone. AR is a promising area in the tech world, but it has yet to be fully exploited when it comes to accessibility, and right now, people still have to tether themselves to their phones to take advantage of the tech.
“I think we will have to see what AR brings given the form factors we have available today. Being able to take advantage of LIDAR has been a real gamechanger for the blind community, but I think time will tell where LIDAR goes, whether that’s people who would ask to put into other models or other things, so I think we’ll see,” says Herrlinger.
She notes that she has already seen interesting developments in augmented reality for those on the autism spectrum, and imagines virtual reality implementations where wheelchair users, who might never have the chance to scuba dive, can swim with blue whales. “I think the sky’s the limit at the end of the day on technology.”
Nothing about us without us
There is a phrase popular in disability rights activism circles: nothing about us without us. In the context of disability, it means that whenever something is made for disabled people, it should have the input of disabled people. Without that input, and still all too often, projects score media coverage while reinventing assistive tech devices over and over, and are sometimes unmasked as mere vapourware.
Over the years, non-disabled people have attempted to make assistive aids for disabled people, but have been criticised for not consulting disabled people when developing those aids. “One thing that’s pretty recurrent to me are apps that facilitate interactions for sign language. Those videos go viral on Instagram and LinkedIn,” remarks Olivier Jeannel, who is profoundly deaf and the founder of Rogervoice, an app that helps facilitate phone calls for people who are hard of hearing. Rogervoice provides closed-captioned conversations on the fly, empowering deaf people to have conversations over the phone.
“There are very talented developers working on sign language recognition, but they’re not for deaf people. And it’s pretty obvious that they haven’t consulted the deaf community on the topic,” Jeannel adds.
When he founded Rogervoice through a Kickstarter campaign in 2014, he, of course, was his own target market, but he also contacted his local association of deaf and hard of hearing people, who put him in touch with others to provide feedback on his prototypes. “That was key,” he says. “Apps should always have the end user in mind. It’s not at the start, it’s not at the end, it’s throughout the lifecycle of an app.”
While it’s unclear if there are any students who identify with a disability in this year’s cohort of Apple Academy developers, the students are encouraged to work and engage with people who might become users of their apps when developing their ideas.
The academy has close ties to local NGOs, for example, with students working on apps designed for the blind in collaboration with the Italian Union of the Blind and Partially Sighted.
Ultimately, for Apple, accessibility isn’t something that’s bolted on to the back of a product, hoping it ticks all the usability checkboxes required by law. At the academy, it’s clear that the design principle runs through the company’s veins.
Accessibility isn’t something that Apple has to do; accessibility is something the company believes it should do. “Our philosophy as a company is that accessibility is a basic human right,” Herrlinger affirms. “For me, it’s not so much about it being inspirational as it is something that I believe we should all be championing for – internally and externally – as much as we can.”