All eyes are on Apple's WWDC event next month. There, the company will undoubtedly show off its latest operating system upgrades, including its artificial intelligence plans for iOS and macOS.
But AI doesn't just mean generative AI. Apple is also building more traditional, user-facing features that simply happen to be powered by artificial intelligence under the hood. As it happens, we've just seen a slew of upcoming iPhone and Mac features that fit this mold, specifically accessibility features.
Apple made a surprise announcement on Wednesday with a batch of new accessibility features. The company says the features will launch "later this year," which almost certainly means they'll be released with iOS 18. Apple switches between using the language of "artificial intelligence" and "machine learning" to describe how these features work, but rest assured the underlying technology is part of Apple's AI push this year.
Eye Tracking lets you control your iPhone using just your eyes
Apple announced that iPhone and iPad users will soon be able to control their devices with their eyes alone. Apple says the front-facing camera on your phone or tablet will use artificial intelligence to set up, calibrate, and power the feature. Most impressive of all, you don't need any additional hardware to use it.
Once you set up Eye Tracking, you can navigate your apps, interact with elements using Dwell Control, and replicate physical buttons, swipes, and other gestures with eye movements alone.
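Apple hasn't described how its dwell selection works internally. As a rough illustration of the general technique, here's a minimal sketch of dwell detection: a "click" fires when the gaze stays within a small radius for a hold period. The radius, hold time, and data shapes are all assumptions for the sketch, not Apple's values.

```python
from dataclasses import dataclass

@dataclass
class GazeSample:
    x: float  # screen position in pixels
    y: float
    t: float  # timestamp in seconds

def detect_dwell(samples, radius=30.0, hold=1.0):
    """Return the (x, y) of a dwell 'click', or None.

    A dwell fires when the gaze stays within `radius` pixels of an
    anchor point for at least `hold` seconds. If the gaze drifts
    outside the radius, the window restarts at the new position.
    """
    if not samples:
        return None
    anchor = samples[0]
    for s in samples:
        if (s.x - anchor.x) ** 2 + (s.y - anchor.y) ** 2 > radius ** 2:
            anchor = s  # gaze moved; restart the dwell window here
            continue
        if s.t - anchor.t >= hold:
            return (anchor.x, anchor.y)
    return None
```

A real implementation would also smooth the noisy gaze signal and give visual feedback while the dwell timer fills, but the core "stay still long enough to click" logic looks like this.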
Music Haptics lets you feel the beat through your iPhone
Apple has added a new music feature for users who are deaf or hard of hearing: Music Haptics uses the Taptic Engine to play taps, textures, and refined vibrations along with the beat of a song. While this sounds like a great accessibility feature, it also seems like a great way to enhance everyone's Apple Music experience. The feature works with millions of songs in Apple Music, and Apple is also making it available as an API so developers can add it to their own apps.
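Apple hasn't published the details of this API in the announcement, so here's only a conceptual sketch of the underlying idea: turning a song's loudness envelope into a series of haptic pulses. The threshold and the rising-edge beat detection are simplifying assumptions for illustration, not Apple's algorithm.

```python
def beat_haptics(envelope, threshold=0.6):
    """Map a normalized amplitude envelope (one 0..1 value per audio
    frame) to haptic pulses.

    A pulse is emitted on each rising edge that crosses `threshold`,
    i.e. roughly once per beat, with a strength taken from the
    frame's amplitude.
    """
    pulses = []
    prev = 0.0
    for i, amplitude in enumerate(envelope):
        if amplitude >= threshold and prev < threshold:
            pulses.append((i, round(amplitude, 2)))  # (frame, strength)
        prev = amplitude
    return pulses
```

On-device, the pulse list would then be handed to the haptics hardware; on iOS that role is played by the Taptic Engine, driven through Apple's haptics frameworks rather than anything this simple.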
Vocal Shortcuts and Listen for Atypical Speech
Vocal Shortcuts is a new feature that lets you assign actions to spoken words or phrases. For example, you could set the word "ring" to open your Apple Watch Activity rings in the Fitness app. Additionally, Listen for Atypical Speech uses on-device artificial intelligence to learn your speech patterns, so your device will recognize the way you speak.
These features are designed for users with conditions such as cerebral palsy, amyotrophic lateral sclerosis (ALS), or stroke, which can affect speech.
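Conceptually, a phrase-to-action feature like this is a lookup from recognized speech to a bound action. Here's a minimal, hypothetical sketch of that mapping layer (the binding names and matching rules are invented for illustration); the hard part Apple is solving, robustly recognizing atypical speech in the first place, happens before this step.

```python
def make_vocal_shortcuts(bindings):
    """Build a recognizer that maps a spoken phrase to an action name.

    `bindings` maps trigger phrases to action identifiers. Matching
    here is a naive case-insensitive whole-phrase lookup; a real
    system would match against the speech recognizer's output with
    some tolerance.
    """
    table = {phrase.lower(): action for phrase, action in bindings.items()}

    def recognize(spoken):
        return table.get(spoken.strip().lower())

    return recognize
```

Usage: `make_vocal_shortcuts({"ring": "open_activity_rings"})` returns a function that yields `"open_activity_rings"` when passed "ring" (in any casing) and `None` for unbound phrases.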
Vehicle Motion Cues aim to prevent motion sickness
Apple wants to help with motion sickness. When your iPhone or iPad recognizes that you're in a moving vehicle, Vehicle Motion Cues places animated dots on the screen, which then move with the direction of the vehicle's motion. Apple says research shows motion sickness is commonly caused by a conflict between what you see and what you feel, and these moving dots may counteract that conflict.
You can have these motion cues appear automatically, or enable them manually from Control Center.
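Apple hasn't explained how the dots are driven, but the idea of matching what you see to what you feel suggests mapping motion-sensor readings to on-screen dot movement. Here's a hedged sketch of that mapping; the gain and clamp values are invented for illustration.

```python
def dot_offsets(accel_x, accel_y, gain=12.0, max_px=40.0):
    """Translate accelerometer readings (m/s^2) into on-screen dot
    offsets (pixels).

    Dots shift in the direction of the measured motion, so the eyes
    receive a cue consistent with what the inner ear feels. Offsets
    are clamped so dots stay near their home positions.
    """
    def clamp(value):
        return max(-max_px, min(max_px, value))

    return clamp(accel_x * gain), clamp(accel_y * gain)
```

A real version would low-pass filter the sensor stream and animate smoothly, but the core idea is this proportional, clamped response to sensed motion.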
CarPlay gets some new accessibility features
Speaking of cars, CarPlay is getting a slew of new accessibility features: Voice Control, which lets you control CarPlay with your voice; Color Filters, which let you fine-tune the colors of the CarPlay interface; and Sound Recognition, which notifies you of sirens, car horns, and other sounds.
visionOS accessibility
Remember Apple Vision Pro? It hasn't attracted much attention recently, but it does still exist, and Apple is developing some visionOS accessibility features, including Live Captions. These captions work for in-person conversations, FaceTime calls, and audio from apps. Apple is also adding new vision features such as Reduce Transparency, Smart Invert, and Dim Flashing Lights, as well as support for Made for iPhone hearing devices and cochlear hearing processors.
New VoiceOver features
VoiceOver is getting new voices. Apple didn't reveal exactly how many, but they're coming. The update also includes a flexible Voice Rotor that lets you control how VoiceOver works, custom volume controls, customizable VoiceOver keyboard shortcuts on macOS, and support for custom vocabularies and complex words.
Magnifier
Apple's Magnifier app doesn't get the love it deserves, but it's getting some new features. Soon, you'll get a new Reader Mode, as well as a quick way to launch Detect Mode using the Action button on an iPhone 15 Pro.
Braille
There are also some new Braille features: a new way to start and stay in Braille Screen Input, Japanese language availability, support for multi-line Braille on Dot Pad devices, and the ability to choose different input and output tables.
Hover Typing
Hover Typing is a new feature that displays larger text whenever you type in a text field. Plus, you can control the font and color.
Personal Voice now available in Mandarin
Last year, Apple launched Personal Voice, an AI-driven feature that creates a synthesized replica of your voice for use with Live Speech. That feature is now coming to Mandarin. Additionally, you can now create a Personal Voice even if you have difficulty reading complete sentences aloud.
Speaking of Live Speech, the feature is also gaining categories and compatibility with Live Captions.
Virtual trackpad
Apple is adding virtual trackpad functionality as part of AssistiveTouch, so an area of the iPhone or iPad screen can be used to move a cursor around the display. I think this would be useful for anyone who wants a trackpad experience, especially on a larger iPad, but doesn't have a physical trackpad on hand.
Switch Control
Switch Control lets you operate your iPhone or iPad with hardware switches. Later this year, it will also be able to use your device's camera to recognize finger-tap gestures as switches, meaning you can control elements on the screen by making gestures with your fingers in view of the camera.