ThumbPitch: Enriching Thumb Interaction on Mobile Touchscreens Using Deep Learning

MCML Authors

Sven Mayer

Prof. Dr.

Associate

Abstract

Today, touchscreens are among the most common input devices for everyday ubiquitous interaction. Yet capacitive touchscreens are limited in expressiveness; thus, a large body of work has focused on extending their input capabilities. One promising approach is to use the index finger's orientation; however, this requires two-handed interaction and poses ergonomic constraints. We propose using the thumb's pitch as an additional input dimension to counteract these limitations, enabling one-handed interaction scenarios. Our deep convolutional neural network, which detects the thumb's pitch, was trained on more than 230,000 ground-truth images recorded using a motion tracking system. We highlight the potential of ThumbPitch by proposing several use cases that exploit the higher expressiveness, especially for one-handed scenarios. We validated our model and tested three use cases in a validation study; the model achieved a mean error of only 11.9°.
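The abstract reports a mean error of 11.9° for the pitch regressor but does not spell out the metric's definition. A minimal sketch, assuming the mean absolute angular error between predicted and motion-capture ground-truth pitch angles (the angle values below are hypothetical, for illustration only):

```python
def mean_absolute_error(predicted, ground_truth):
    """Mean absolute error in degrees between predicted and
    ground-truth thumb pitch angles (assumed metric, not confirmed
    by the abstract)."""
    assert len(predicted) == len(ground_truth) and predicted
    return sum(abs(p - g) for p, g in zip(predicted, ground_truth)) / len(predicted)

# Hypothetical predictions vs. motion-capture ground truth (degrees)
preds = [30.0, 45.0, 12.0]
truth = [25.0, 50.0, 10.0]
print(mean_absolute_error(preds, truth))  # 4.0
```

Under this reading, "mean error of 11.9°" would correspond to the average per-image deviation of the network's pitch estimate from the tracked ground truth over the validation set.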

inproceedings


OZCHI 2022

34th Australian Conference on Human-Computer Interaction. Canberra, Australia, Nov 29-Dec 02, 2022.

Authors

J. Ullerich • M. Windl • A. Bulling • S. Mayer

Links

DOI

Research Area

 C5 | Humane AI

BibTeXKey: UWB+22
