Control your iPad with your eyes: Apple accessibility features to be updated

Apple today announced that it will launch software features designed for people with mobility, vision, hearing and cognitive impairments, including the ability to control iPad using only their eyes. It also announced a new service, SignTime, which will allow customers visiting Apple Store locations in the United States, the United Kingdom and France to remotely contact sign language interpreters.

In a post on its official website, Apple noted that with software updates later this year, people with physical disabilities will be able to operate Apple Watch using AssistiveTouch, which senses the nuances of muscle and tendon activity and lets users control an on-screen cursor through gestures such as pinching or clenching.

Later this year, iPad will also support third-party eye-tracking devices. Compatible MFi-certified devices will track where users are looking on the screen, with the cursor following their gaze; holding their gaze in place will trigger actions such as tapping.

For blind and low-vision users, Apple's VoiceOver screen reader will add an image-description feature that describes the pose and position of people in a picture. In addition, Apple will add support for bidirectional hearing aids so that people who are deaf or hard of hearing can make hands-free phone and FaceTime calls.

In addition, the new SignTime service will allow customers to communicate with AppleCare and Apple customer service specialists in American Sign Language (ASL), British Sign Language (BSL) or French Sign Language (LSF) from their browser, and customers who visit Apple Store locations in person will be able to use SignTime to remotely contact a sign language interpreter. The service will launch first in the U.S., the U.K. and France, and is expected to expand to other countries in the future.