Skin touches as an input method
EgoTouch: Recognizing the intensity and direction of touches on the skin surface

By Hendrik Härter | Translated by AI | Reading time: 2 min


The goal of EgoTouch is to use the skin surface as an input method for VR/AR applications. Additional hardware is not necessary. Only cameras in the headset and machine learning are used to detect the intensity and direction of touches on the skin.

Cameras detect touches on the skin surface; machine learning methods evaluate the movements. (Image: Human-Computer Interaction Institute)

Researchers at Carnegie Mellon University in the USA have developed a technique called EgoTouch that enables skin touches to be used as an input method without additional hardware. Using artificial intelligence and the cameras already built into standard headsets, the system accurately detects touches on the skin surface, along with their intensity and direction of movement.

An earlier project called OmniTouch, developed by Chris Harrison, Professor and Head of the Future Interfaces Group, used similar approaches but required a special depth camera, which was not only expensive but also impractical. Vimal Mollyn, a doctoral student on Harrison's team, had the idea to use machine learning to train regular cameras to detect touches.

When machine learning assists

"When you place your finger on the skin, you see shadows and deformations of the skin that occur only upon touch," explains Mollyn. "We've used these features to train a model that reliably detects touches."

For data collection, the team developed a special touch sensor that was attached to the inside of the index finger and the palm. This sensor gathered data on different types and intensities of touches without being visible to the camera. Using this data, the model was trained to associate visual features like shadows and skin deformations with touch intensity—entirely without manual annotation.
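This setup lends itself to automatic labeling: the finger-worn sensor supplies ground truth while the headset camera only ever sees the bare skin. The following Python sketch is a minimal illustration of that idea, not the CMU code; the network architecture, input size, and sensor format are assumptions made for the example.

```python
# Illustrative sketch: train a small CNN to predict touch intensity from
# camera crops, with labels taken from an on-skin pressure sensor instead
# of manual annotation. All shapes and names are assumptions.
import torch
import torch.nn as nn

class TouchIntensityNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(32, 1)  # regresses touch intensity

    def forward(self, x):
        x = self.features(x).flatten(1)
        return self.head(x)

model = TouchIntensityNet()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

# Stand-in for synchronized recordings: camera crops around the fingertip
# and the force values the hidden finger-worn sensor measured at the same
# timestamps (here replaced by random tensors so the sketch runs).
frames = torch.randn(64, 3, 96, 96)   # camera crops (batch, C, H, W)
sensor_force = torch.rand(64, 1)      # automatically generated labels, 0..1

for epoch in range(5):
    pred = model(frames)
    loss = loss_fn(pred, sensor_force)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```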

To optimize the model, the team collected data from 15 users with different skin tones and hair structures. The touch data was recorded over several hours in different situations, activities, and lighting conditions.

Video: How EgoTouch works

Precision and practical applications

The result: EgoTouch recognizes touches with an accuracy of over 96 percent and a false-positive rate of about 5 percent. The system can also distinguish between touch types such as pressing, releasing, and swiping, and it classifies touch intensity (light or firm) with 98 percent accuracy. "This enables features like a right-click directly on the skin," says Mollyn.
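To give a sense of what such classified output enables on the application side, here is a hedged Python sketch of an event-dispatch layer that maps a reported touch type and intensity to conventional UI actions, including the right-click Mollyn mentions. The event fields and the mapping are illustrative assumptions, not the published EgoTouch interface.

```python
# Illustrative sketch of an application layer consuming EgoTouch-style events:
# touch type plus light/firm intensity is mapped to familiar UI actions.
from dataclasses import dataclass

@dataclass
class TouchEvent:
    kind: str        # "down", "up", or "drag" as reported by the classifier
    intensity: str   # "light" or "firm"
    position: tuple  # (x, y) on the virtual skin interface

def dispatch(event: TouchEvent) -> str:
    if event.kind == "up" and event.intensity == "firm":
        return f"right-click at {event.position}"   # firm press acts like a right-click
    if event.kind == "up":
        return f"left-click at {event.position}"
    if event.kind == "drag":
        return f"scroll/drag to {event.position}"
    return "ignore"

print(dispatch(TouchEvent("up", "firm", (120, 45))))  # -> right-click at (120, 45)
```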

The accuracy remains consistent, regardless of skin color, hair texture, or touch area—whether on the palm, forearm, or back of the hand. Only in areas like the knuckle does the system work less reliably because the skin deforms less there.

A look into the future

In the future, the researchers plan to expand EgoTouch with night vision cameras to enable the system to function in low light conditions. Additionally, efforts are being made to apply the technology to surfaces other than skin.

"For the first time, we have developed a system that works with the standard cameras of common AR/VR headsets—without calibration, ready for use," emphasizes Mollyn. "Now we can build on our previous work on skin interfaces and finally realize this vision."

EgoTouch opens new perspectives for developers: The combination of machine learning and minimal hardware could give AR/VR a decisive innovation boost—with the skin as an intuitive input surface. (heh)

Link: OmniTouch: Wearable Multitouch Interaction Everywhere.
