Project Gameface: Control Your Cursor with Facial Gestures on Android

Update: 2024-05-15 12:42 IST

Google has announced the release of Project Gameface's open-source code for Android, enabling developers to integrate facial gesture control into their apps. Users can control the cursor with facial expressions, such as moving their mouth to steer it or raising their eyebrows to click and drag.

Originally introduced at last year's Google I/O for desktop, Project Gameface interprets facial movements using the device's camera and MediaPipe's Face Landmarks Detection API.
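For developers exploring a similar pipeline on Android, the sketch below shows one way to configure MediaPipe's Face Landmarker task to report face blendshape scores from live camera frames. It is illustrative only, under assumed settings: the model asset name, option values, and listener body are assumptions, not code taken from Project Gameface.

```kotlin
import android.content.Context
import com.google.mediapipe.framework.image.MPImage
import com.google.mediapipe.tasks.core.BaseOptions
import com.google.mediapipe.tasks.vision.core.RunningMode
import com.google.mediapipe.tasks.vision.facelandmarker.FaceLandmarker
import com.google.mediapipe.tasks.vision.facelandmarker.FaceLandmarker.FaceLandmarkerOptions
import com.google.mediapipe.tasks.vision.facelandmarker.FaceLandmarkerResult

// Illustrative setup: creates a FaceLandmarker that runs on live camera frames
// and reports blendshape scores (e.g. eyebrow raise, mouth open) for each frame.
fun createFaceLandmarker(context: Context): FaceLandmarker {
    val options = FaceLandmarkerOptions.builder()
        .setBaseOptions(
            BaseOptions.builder()
                // Hypothetical asset name; bundle the Face Landmarker model with the app.
                .setModelAssetPath("face_landmarker.task")
                .build()
        )
        .setRunningMode(RunningMode.LIVE_STREAM)
        .setOutputFaceBlendshapes(true) // blendshape scores drive gesture detection
        .setResultListener { result: FaceLandmarkerResult, _: MPImage ->
            // Each frame: read blendshape scores such as "browInnerUp" or "jawOpen"
            // and forward them to the app's own cursor-control logic.
            result.faceBlendshapes().ifPresent { faces ->
                val scores = faces.firstOrNull()
                    ?.associate { it.categoryName() to it.score() }
                    .orEmpty()
                // e.g. cursorController.onFrame(scores, headDx, headDy)  // hypothetical hook
            }
        }
        .build()
    return FaceLandmarker.createFromOptions(context, options)
}

// Camera frames, converted to MPImage, are then submitted with:
// faceLandmarker.detectAsync(mpImage, frameTimestampMs)
```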

"Through the device's camera, it seamlessly tracks facial expressions and head movements, translating them into intuitive and personalized control," Google explained in its announcement. "Developers can now build applications where their users can configure their experience by customizing facial expressions, gesture sizes, cursor speed, and more."

While Project Gameface was initially designed for gamers, Google is exploring broader applications with partners such as Incluzza, focusing on accessibility in settings beyond gaming.

Inspired by quadriplegic gamer Lance Carr, Project Gameface aims to provide a low-cost, accessible alternative to expensive head-tracking systems and to improve accessibility in work, school, and social settings.
