Apple's accessibility offerings continue to expand, as new features like eye tracking, touch-enhanced music listening, and settings for those with atypical speech come to its on-the-go devices.
Announced in the midst of a month-long recognition of Global Accessibility Awareness Day (May 16), the lineup of customization options helps users with physical disabilities better control and interact with their iPad or iPhone.
"These new features will make an impact in the lives of a wide range of users, providing new ways to communicate, control their devices, and move through the world," wrote Sarah Herrlinger, Apple's senior director of Global Accessibility Policy and Initiatives.
Apple's new eye tracking controls are, unsurprisingly, powered by AI, which uses the device's front-facing camera to calibrate, scan, and track facial movements. "With Eye Tracking, users can navigate through the elements of an app and use Dwell Control to activate each element, accessing additional functions such as physical buttons, swipes, and other gestures solely with their eyes," Apple explains.
While eye-tracking systems for computers are a long-established technology, mobile devices are slowly catching up. Apple — like other tech companies capitalizing on quickly evolving AI — leverages integrated on-device machine learning to process facial movements, turning the technology into a hardware- and accessory-free offering.
A feature that feels long overdue for the technically advanced Apple Music streaming service, Music Haptics allows users who are Deaf or hard of hearing to experience music on their device via touch, turning the iPhone's Taptic Engine into a conveyor of beats and vibrations. When turned on, the setting adds "taps, textures, and refined vibrations" to the music.
For now, the feature will only be available on Apple Music's catalogue of songs.
Acknowledging the range of speech abilities and atypical speech patterns among people with disabilities, Vocal Shortcuts lets users assign actions to custom utterances, not just phrases. The setting is paired with the new Listen for Atypical Speech option, which uses on-device machine learning to recognize a user's unique speech. It is targeted at those with conditions that affect speech, such as cerebral palsy, amyotrophic lateral sclerosis (ALS), or stroke, Apple explains.
Apple also introduced improvements across its range of accessibility tools, including a Reader Mode for its vision-assistance app Magnifier, a new Hover Typing option for those with low vision, a Virtual Trackpad for those using AssistiveTouch with limited range of motion, and new customizations for VoiceOver and Voice Control.
The company will also be adding systemwide Live Captions to visionOS, as well as tools like Reduce Transparency, Smart Invert, and Dim Flashing Lights "for users who have low vision, or those who want to avoid bright lights and frequent flashing."
And, rounding out the additions, CarPlay users can now access Voice Control, Color Filters, and Sound Recognition, helping individuals operate controls with just their voice, view color-blind-friendly screens, and be alerted to outside sounds.