For The New York Times (republished by Infobae).
Thanks to tools like screen readers and adjustable text sizes, smartphones have slowly become more useful for people with a range of physical abilities.
With the recent release of Apple’s iOS 16 and Google’s Android 13 software, even more accessibility features have been introduced or enhanced, including improved live transcription tools and apps that use artificial intelligence to identify objects. For example, if you set it up, your phone can send you a visual alert when a baby is crying, or an audible alert when you approach a door.
And plenty of accessibility tools, old and new, make the phone easier for everyone to use. Here’s a guide.
On an iOS or Android phone, open the Settings app and choose Accessibility to see all available tools and features. Take your time to discover and experiment.
Apple’s and Google’s websites have dedicated accessibility sections for full reference, but keep in mind that the exact features available vary by software version and phone model.
Dragging or tapping to navigate a phone’s features isn’t for everyone, but iOS and Android offer multiple ways to move around screens and menus, including quick-tap shortcuts and gestures to get tasks done.
These controls (like Apple’s AssistiveTouch tools and the Back Tap feature, which performs assigned actions when you tap the back of the phone) reside in the Touch section of the iOS accessibility settings.
Accessibility shortcuts on Android offer similar options. One way to access them is by opening the main settings icon, selecting System, then Gestures and system navigation.
Both platforms support navigation with third-party adaptive devices, such as Bluetooth switches, or with the camera recognizing facial expressions assigned to actions (for example, looking to the left to swipe left). These devices and gestures can be configured in the Switch Control and Head Tracking settings on iOS, or with the Camera Switches feature and the Project Activate app on Android.
Apple and Google offer various tools for those who cannot see the screen. Apple’s iOS software includes the VoiceOver feature, and Android has a similar tool called TalkBack; both speak descriptions of what’s showing on the screen (the battery level, for example) as you slide your finger over it.
If you turn on Voice Control on iOS or Voice Access on Android, you can operate your phone with spoken commands. And if you enable the Speak Content setting on iOS or Select to Speak on Android, your phone reads aloud what’s on the screen, which can be useful for catching and correcting errors by ear.
Don’t forget some of the classic ways of interacting with your phone hands-free. Apple’s Siri and Google Assistant can open apps and perform actions using voice commands. You can also enter text by speaking with Dictation (in the iOS keyboard settings) and Google voice typing.
The accessibility settings on iOS and Android include zoom shortcuts for magnifying areas of the phone’s screen. And if you generally prefer bigger, bolder text and other display adjustments: on iOS, open Settings, select Accessibility, then Display & Text Size; on Android, go to Settings, then Accessibility, and select Text and display.
The Magnifier app, Apple’s digital magnifying glass for magnifying objects in the camera view, has been improved in iOS 16. The app’s new features are designed to help blind or partially sighted people better use their iPhone to recognize doors and people nearby, as well as to identify and describe objects and surroundings.
Magnifier results are spoken aloud or displayed in large type on the iPhone’s screen. Door and People Detection use the device’s lidar (light detection and ranging) scanner to calculate distances and require a lidar-equipped model (iPhone 12 Pro or later).
To set your preferences, open the Magnifier app and select the settings icon in the lower left corner; if you can’t find the app on your phone, you can download it for free from the App Store. Magnifier is just one of many vision tools on iOS, and the company’s website has instructions on how to set up the app on iPhone or iPad.
Google’s recently updated visual assistance app, Lookout (a free download on the Play Store), can identify coins, text, food labels, objects and more. Google introduced Lookout in 2018, and it works on Android 6 and later.
Both platforms offer controls to amplify the sound around you through your headphones. On iOS, go to the Audio/Visual section for the Headphone Accommodations settings. On Android, use the Sound Amplifier app.
With the iOS 16 update, Apple added the Live Captions setting, a real-time transcription feature that converts the audible dialogue around you into text on the screen. Android’s accessibility settings include Live Caption, which automatically captions videos, podcasts, video calls and other audio media playing on your phone.
Google’s free Live Transcribe app for Android converts nearby speech to on-screen text and can also flash visual alerts when the phone detects sounds such as a doorbell or a smoke alarm. The Sound Recognition tool, in the Hearing section of the iPhone’s accessibility settings, does the same. And look for multisensory notification settings on your phone, such as LED flash or vibration alerts, so you don’t miss a thing.