Apple showcases an accessible future across Mac, iPhone, iPad, Vision Pro

Not everyone can see their screen or hear the music. Fortunately, Apple has some accessibility features on the way to assist with both.

If you’ve ever wondered whether using a computer, phone, tablet, or TV is the same for everyone, you only need to glance at some of the research being done in the field of accessibility.

Accessibility is a category growing every year, and as technology aims to become more inclusive overall, devices and software experiences are improving to make what they offer a little bit easier for everyone to use.

While many comfortably use a keyboard and mouse, and can read off a screen relatively easily, many do not, and using a phone or computer can be as difficult as navigating life with your hands tied behind your back.

With that in mind, a company that has had an accessibility officer almost since its inception is this year showing not just how its own devices are becoming more accessible to use, but also how they can help with real-world activities.

For instance, Apple will be rolling out a new Accessibility Reader across its devices, a reading mode that allows apps and the operating system to change font, colour, and spacing, and even support spoken word.
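For developers curious what honouring those reading preferences looks like today, here’s a minimal sketch using Apple’s existing Dynamic Type and speech synthesis APIs. It’s our own illustration, assuming the new Accessibility Reader builds on the same system settings, rather than Apple’s actual implementation:

```swift
import SwiftUI
import AVFoundation

// A sketch of honouring system reading preferences with existing APIs:
// scalable fonts and spacing via Dynamic Type, spoken word via AVSpeechSynthesizer.
struct ReaderView: View {
    @Environment(\.dynamicTypeSize) private var typeSize
    @State private var speaker = AVSpeechSynthesizer()
    let text = "Accessibility settings change how this paragraph renders."

    var body: some View {
        VStack(alignment: .leading, spacing: typeSize.isAccessibilitySize ? 16 : 8) {
            Text(text)
                .font(.body) // scales automatically with the user's Dynamic Type size
                .lineSpacing(typeSize.isAccessibilitySize ? 8 : 2)
            Button("Speak") {
                // Reads the text aloud with the default system voice.
                speaker.speak(AVSpeechUtterance(string: text))
            }
        }
        .padding()
    }
}
```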

It includes a way to zoom into text seen through the Vision Pro, almost like using magnifying glasses inside the headset, as well as a way for the Vision Pro to describe surroundings when the cameras are switched on. That idea effectively gives wearers an AI assistant, using machine learning to look at the surroundings and detail what’s nearby.

Apple is also expanding Live Captions to the Apple Watch, turning the wearable into a real-time caption reader that can be used in conjunction with the hearing aid feature found in the AirPods Pro 2 (which only switches on if your hearing loss is severe enough to warrant it).

And there are other features, such as how eye tracking on the iPhone and iPad can now be used to make selections simply by staring at (dwelling on) an item, while head tracking lets head movements control aspects of an iPhone or iPad more easily.

Vibrations that explain musical parts are also improving with Music Haptics, allowing users to tweak the intensity and choose whether the haptics apply to the whole song or just the vocals, something that likely grows out of Apple’s karaoke research into turning down the vocals in a song. Meanwhile, a Sound Recognition feature now supports names, allowing people who are hard of hearing to know when their name is being called.
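Music Haptics itself is a system feature, but the idea of scaling vibration intensity can be sketched with Apple’s existing Core Haptics framework. The helper below is our own illustration, with `userIntensity` a hypothetical stand-in for the setting Apple describes:

```swift
import CoreHaptics

// Hypothetical helper: plays a single beat, with `userIntensity` (0.0-1.0)
// standing in for the intensity setting Music Haptics exposes.
// The caller creates the engine with CHHapticEngine() and calls start() on it first.
func playBeat(on engine: CHHapticEngine, userIntensity: Float) throws {
    let intensity = CHHapticEventParameter(parameterID: .hapticIntensity, value: userIntensity)
    let sharpness = CHHapticEventParameter(parameterID: .hapticSharpness, value: 0.5)
    let beat = CHHapticEvent(eventType: .hapticTransient,
                             parameters: [intensity, sharpness],
                             relativeTime: 0)
    let pattern = try CHHapticPattern(events: [beat], parameters: [])
    try engine.makePlayer(with: pattern).start(atTime: CHHapticTimeImmediate)
}
```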

And Apple’s Personal Voice has also improved, giving people who can’t speak a way to recreate their real voice using only a sample of recordings. It’s not quite to the level of ElevenLabs’ system that we’ve shown in stories on this website, but given it’s part of the iPhone operating system and seemingly free to use, it’s a whole lot better than simply relying on a robotic system voice from the phone itself.
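Unlike some of the other features, Personal Voice is already something third-party apps can tap into on iOS 17 and later via the AVSpeechSynthesis APIs. A minimal sketch of asking permission and speaking with the user’s own voice might look like this (the wrapper class is our own, not Apple’s):

```swift
import AVFoundation

// A minimal sketch of speaking with a user's Personal Voice on iOS 17+.
// The synthesizer is held as a property so it isn't deallocated mid-speech.
final class PersonalVoiceSpeaker {
    private let synthesizer = AVSpeechSynthesizer()

    func speak(_ text: String) {
        AVSpeechSynthesizer.requestPersonalVoiceAuthorization { [weak self] status in
            guard status == .authorized else { return }
            // Voices the user created in Settings > Accessibility > Personal Voice.
            let personal = AVSpeechSynthesisVoice.speechVoices()
                .first { $0.voiceTraits.contains(.isPersonalVoice) }
            let utterance = AVSpeechUtterance(string: text)
            utterance.voice = personal // nil falls back to the default system voice
            self?.synthesizer.speak(utterance)
        }
    }
}
```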

Over on the App Store, Apple will provide a way for developers to show how their apps offer accessibility features, with “Nutritional Labels” listing what’s provided. The labels are essentially optional, and from what we understand, developers can add the details without submitting a new build of their app, meaning devs can go right in and make the changes as the feature rolls out.

Meanwhile, Apple is improving accessibility for the real world, combining devices to make aspects of life a little easier to navigate.

Using a combination of an iPhone and a MacBook (and likely a clip to hold the iPhone to the MacBook), you can use the iPhone’s camera to zoom in on something happening in a room, such as a chalkboard, and have Magnifier on the Mac work with the iPhone’s camera using Continuity Camera.

That feature will relay the camera’s view to the Mac, and even allow you to tweak the colour and contrast to better see the details from afar. The text can even be extracted, allowing you to read any details without needing to study the chalkboard.
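Apple hasn’t published an API for Magnifier’s text extraction, but the same kind of result can be sketched with the Vision framework’s text recognition run over a captured camera frame. The function below is our own illustration:

```swift
import CoreGraphics
import Vision

// Our own illustration: recognise text in a captured frame (`cgImage`)
// and hand back the best guess for each detected line.
func extractText(from cgImage: CGImage, completion: @escaping ([String]) -> Void) {
    let request = VNRecognizeTextRequest { request, _ in
        let observations = request.results as? [VNRecognizedTextObservation] ?? []
        completion(observations.compactMap { $0.topCandidates(1).first?.string })
    }
    request.recognitionLevel = .accurate // slower, but better for dense chalkboard text
    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try? handler.perform([request])
}
```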

Likewise, Braille Access is being added across the ecosystem, supporting braille devices and connecting Apple services such as Live Captions to braille displays.

The reason so much of this pops up around this time is largely to do with Accessibility Day.

While accessibility is an important topic at the best of times, the third Thursday of May (the day this story was published) is Global Accessibility Awareness Day, a day for talking about accessibility and ways to make the technological experience more inclusive overall.

Apple’s additions are largely rolling out alongside that day, with Apple Music, Fitness+, and other Apple services showcasing accessibility stories throughout the week.

Meanwhile, this assortment of features is on the cards throughout the year, and will likely be included as part of what Apple has in store for operating system changes across iOS, iPadOS, visionOS, and macOS at WWDC 25 in June.
