Special to Infobae from The New York Times.
Smartphones have slowly become more useful to people with a range of physical abilities, thanks to tools like screen readers and adjustable text sizes.
With the recent release of Apple’s iOS 16 and Google’s Android 13 software, even more accessibility features have arrived or improved, including better live transcription tools and apps that use artificial intelligence to identify objects. For example, once set up, your phone can send you a visual alert when a baby is crying or an audio alert when you’re approaching a door.
And many accessibility tools, old and new, make it easier for everyone to use the phone. Here’s a guide.
On an iOS or Android phone, open the Settings app and select Accessibility to find all available tools and features. Take the time to explore and experiment.
For a full reference, the Apple and Google websites have dedicated accessibility sections, but keep in mind that the exact features available will vary depending on your software version and phone model.
Dragging or tapping to navigate a phone’s features isn’t for everyone, but iOS and Android offer several other ways to move around screens and menus, including quick-tap shortcuts and gestures that perform common tasks.
These controls (such as Apple’s AssistiveTouch tools and its Back Tap feature, which performs an assigned action when you tap the back of your phone) are in the iOS Touch settings.
Accessibility shortcuts on Android offer similar options. One way to access them is by opening the main Settings icon, selecting System, then Gestures and System Navigation.
Both platforms support navigation using third-party adaptive devices, such as Bluetooth switches, or using the camera to recognize facial expressions mapped to actions, such as looking left to swipe left. These devices and actions can be configured in the Switch Control and Head Tracking settings on iOS, or with the Camera Switches feature and the Project Activate app on Android.
Apple and Google provide various tools for those who cannot see the screen. Apple’s iOS software offers the VoiceOver feature, and Android has a similar tool called TalkBack; both speak descriptions of what’s on your screen (like your battery level) as you move your finger over it.
If you turn on Voice Control for iOS or Voice Access for Android, you can control your phone with spoken commands. And if you turn on the iOS Speak Screen or Android Select to Speak setting, your phone reads aloud what’s on the screen, which can be useful for proofreading by ear.
Don’t forget some of the classic methods of hands-free interaction with your phone. Apple’s Siri and Google Assistant can open apps and perform actions with spoken commands. Plus, the Dictation feature (in iOS Keyboard settings) and Google Voice Typing let you type text by speaking.
In their Accessibility settings, iOS and Android include zoom shortcuts for magnifying areas of the phone’s screen. And if you’d generally prefer larger, bolder text and other display adjustments, on iOS open the Settings icon, choose Accessibility, and select Display & Text Size. On Android, go to Settings, then Accessibility, and choose Text and Display.
The Magnifier app, Apple’s digital magnifying glass for enlarging objects in the camera view, has been enhanced in iOS 16. The app’s new features, designed to help people who are blind or have low vision get more out of their iPhones, use the camera to detect doors and nearby people, as well as to identify and describe objects and surroundings.
Magnifier results are spoken aloud or displayed in large type on the iPhone screen. Door and people detection uses the on-device lidar (light detection and ranging) scanner to calculate distances and requires an iPhone 12 Pro or later model with the scanner.
To set your preferences, open the Magnifier app and select the Settings icon in the bottom-left corner. If you can’t find the app on your phone, it’s a free download from the App Store. Magnifier is just one of many vision tools in iOS, and Apple’s site has a guide to setting up the app on the iPhone or iPad.
Google’s recently updated vision-assistance app, Lookout (a free download from the Play Store), can identify currency, text, food labels, objects, and more. Google introduced Lookout in 2018, and it works on Android 6 and later.
Both platforms offer controls to amplify the sound around you through your headphones or hearing aids. On iOS, go to the Audio/Visual section for Headphone Accommodations. On Android, check the Sound Amplifier settings.
With the iOS 16 update, Apple introduced the Live Captions setting, a real-time transcription feature that converts the audible dialogue around you into text on the screen. Android’s accessibility settings include Live Caption, which automatically captions videos, podcasts, video calls, and other audio media playing on your phone.
Google’s free Live Transcribe app for Android converts nearby speech to text on the screen and can also provide visual alerts when your phone recognizes sounds like doorbells or smoke detectors. The Sound Recognition tool in the Hearing section of the iPhone’s Accessibility settings does the same thing. And look for multisensory notification settings on your phone, like LED flash alerts or vibrating alerts, so you don’t miss a thing.