Google expands hands-free and eyes-free interfaces on Android

As part of Global Accessibility Awareness Day 2024, Google is showing off some updates to Android that should be useful to folks with mobility or vision impairments.

Project Gameface allows gamers to use their faces to move the cursor and perform common click-like actions on desktop, and now it's coming to Android.

The project lets people with limited mobility use facial movements, like raising an eyebrow, moving their mouth or turning their head, to activate a variety of functions. There's basic stuff like a virtual cursor, but also gestures where, for instance, you can define the beginning and end of a swipe by opening your mouth, moving your head, then closing your mouth.
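Gameface is open source and built on MediaPipe's Face Landmarker, which turns each camera frame into a set of "blendshape" scores (how raised an eyebrow is, how open the mouth is). Below is a minimal sketch of how one such score could be thresholded into the mouth-open swipe gesture described above; the model path, threshold value, and swipe bookkeeping are illustrative assumptions, not Gameface's actual code.

```python
# Sketch: detecting an "open mouth" gesture from a camera frame using
# MediaPipe Face Landmarker blendshape scores. Threshold is illustrative.
import mediapipe as mp
from mediapipe.tasks import python
from mediapipe.tasks.python import vision

JAW_OPEN_THRESHOLD = 0.5  # adjustable per user, like Gameface's sliders

options = vision.FaceLandmarkerOptions(
    base_options=python.BaseOptions(model_asset_path="face_landmarker.task"),
    output_face_blendshapes=True,  # per-expression confidence scores
    num_faces=1,
)
detector = vision.FaceLandmarker.create_from_options(options)

def mouth_is_open(frame: mp.Image) -> bool:
    """Return True if the jawOpen blendshape exceeds the user's threshold."""
    result = detector.detect(frame)
    if not result.face_blendshapes:
        return False  # no face found in this frame
    scores = {c.category_name: c.score for c in result.face_blendshapes[0]}
    return scores.get("jawOpen", 0.0) > JAW_OPEN_THRESHOLD

# A swipe could then be bracketed by mouth state: start the drag on the
# frame where mouth_is_open() first flips True, track head movement while
# it stays True, and release when it flips back to False.
```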

It's customizable to a person's abilities, and Google researchers are working with Incluzza in India to test and improve the tool. Certainly for many people, the ability to simply and easily play many of the thousands of games (well, millions probably, but thousands of good ones) on Android will be more than welcome.

There's a great video here that shows the product in action and being customized; Jeeja, seen in the preview image, talks about changing how much she needs to move her head to activate a gesture.

https://youtu.be/55T3HLuFLR0

That kind of granular adjustment is every bit as important as being able to set the sensitivity of your mouse or trackpad.

Another feature for folks who can't easily operate a keyboard, on-screen or physical: a new non-text mode in Look to Speak that lets people choose and send emoji, either on their own or as stand-ins for a phrase or action.

You can also add your own photos, so someone could keep common phrases and emoji on speed dial, plus frequently used contacts represented by pictures of them, all accessible with a few glances.

For people with vision impairments, there are a variety of tools out there (of varying effectiveness, no doubt) that let a user identify things the phone's camera sees. The use cases are countless, so sometimes it's best to start with something simple, like finding an empty chair, or recognizing the user's own keychain and pointing it out.

Users will be able to add custom object or location recognition so the instant description function will give them what they need and not just a list of generic objects like "a mug and a plate on a table." Which mug?!
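One plausible way to build that kind of personal recognition is image-embedding similarity: compare what the camera sees against reference photos of the user's own items. Here's a minimal sketch using MediaPipe's Image Embedder; the model file, filenames, and similarity threshold are all assumptions for illustration, not details Google has shared about its implementation.

```python
# Sketch: matching a camera frame against a reference photo of a personal
# item (e.g., a specific keychain) via embedding similarity. Model path,
# filenames, and the 0.8 threshold are assumptions.
import mediapipe as mp
from mediapipe.tasks import python
from mediapipe.tasks.python import vision

options = vision.ImageEmbedderOptions(
    base_options=python.BaseOptions(model_asset_path="embedder.tflite"),
    l2_normalize=True,  # normalize so cosine similarity is well-behaved
)
embedder = vision.ImageEmbedder.create_from_options(options)

# Embed one reference photo of the user's own keychain.
reference = embedder.embed(mp.Image.create_from_file("my_keychain.jpg"))

def looks_like_my_keychain(frame_path: str, threshold: float = 0.8) -> bool:
    """Compare a camera frame's embedding against the reference photo."""
    frame = embedder.embed(mp.Image.create_from_file(frame_path))
    similarity = vision.ImageEmbedder.cosine_similarity(
        reference.embeddings[0], frame.embeddings[0]
    )
    return similarity > threshold
```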

Apple also showed off some accessibility features yesterday, and Microsoft has a few as well. Take a minute to peruse these projects, which seldom get the main-stage treatment (though Gameface did) but are of major importance to those for whom they are designed.

https://techcrunch.com/2024/05/15/apple-accessibility-features-2024