Publication Details
Daniel Buschek, Florian Alt
ProbUI: Generalising Touch Target Representations to Enable Declarative Gesture Definition for Probabilistic GUIs. In CHI '17: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, Denver, CO, USA, May 6-11, 2017. ACM, New York, NY, USA.
We present ProbUI, a mobile touch GUI framework that merges the ease of use of declarative gesture definition with the benefits of probabilistic reasoning. It helps developers handle uncertain input and implement feedback and GUI adaptations. ProbUI replaces today's static target models (bounding boxes) with probabilistic gestures ("bounding behaviours"). It is the first touch GUI framework to unite concepts from three areas of related work: 1) Developers declaratively define touch behaviours for GUI targets. As a key insight, the declarations imply simple probabilistic models (HMMs with 2D Gaussian emissions). 2) ProbUI derives these models automatically to evaluate users' touch sequences. 3) It then infers the intended behaviour and target. Developers bind callbacks to gesture progress, completion, and other conditions. We show ProbUI's value by implementing existing and novel widgets, and report developer feedback from a survey and a lab study.
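To make the core idea concrete, below is a minimal sketch in Python (not ProbUI's actual Android API) of the inference step the abstract describes: each candidate behaviour of a target is modelled as a left-to-right HMM with 2D Gaussian emissions, a touch sequence is scored against every candidate, and the best-scoring model indicates the inferred target and behaviour. All class, function, and parameter names here are illustrative assumptions, not part of ProbUI.

# Sketch of the stated concept: left-to-right HMMs with 2D Gaussian
# emissions score a touch sequence; the best-scoring (target, behaviour)
# pair is the inferred intention. Names and values are assumptions.

import numpy as np
from scipy.stats import multivariate_normal


class GestureHMM:
    """Left-to-right HMM with 2D Gaussian emissions for one behaviour."""

    def __init__(self, means, cov, stay_prob=0.6):
        self.means = np.asarray(means, dtype=float)  # one 2D mean per state
        self.cov = np.asarray(cov, dtype=float)      # shared 2x2 covariance
        n = len(self.means)
        # Transition matrix: stay in the current state or advance to the next.
        self.A = np.zeros((n, n))
        for i in range(n):
            self.A[i, i] = stay_prob if i < n - 1 else 1.0
            if i < n - 1:
                self.A[i, i + 1] = 1.0 - stay_prob
        self.pi = np.zeros(n)
        self.pi[0] = 1.0                             # gestures start in state 0

    def log_likelihood(self, touches):
        """Forward algorithm in log space over a sequence of (x, y) touches."""
        touches = np.asarray(touches, dtype=float)
        log_b = np.array([[multivariate_normal.logpdf(t, m, self.cov)
                           for m in self.means] for t in touches])
        log_alpha = np.log(self.pi + 1e-300) + log_b[0]
        for t in range(1, len(touches)):
            log_alpha = log_b[t] + np.array([
                np.logaddexp.reduce(log_alpha + np.log(self.A[:, j] + 1e-300))
                for j in range(len(self.means))])
        return np.logaddexp.reduce(log_alpha)


def infer_target(candidates, touches):
    """Return the (target, behaviour) whose model best explains the touches."""
    scores = {key: hmm.log_likelihood(touches) for key, hmm in candidates.items()}
    return max(scores, key=scores.get), scores


# Example: two buttons, each with a one-state "tap" behaviour centred on the
# button, plus a multi-state "slide" behaviour crossing button_a left to right.
cov = np.eye(2) * 20.0 ** 2                          # ~20 px touch noise (assumed)
candidates = {
    ("button_a", "tap"):   GestureHMM([[100, 100]], cov),
    ("button_a", "slide"): GestureHMM([[60, 100], [100, 100], [140, 100]], cov),
    ("button_b", "tap"):   GestureHMM([[300, 100]], cov),
}

touches = [(65, 102), (98, 99), (138, 101)]          # a swipe across button_a
best, scores = infer_target(candidates, touches)
print("inferred:", best)                             # -> ('button_a', 'slide')

In this sketch a single-state model plays the role of a plain tap on a bounding box, while multi-state models describe gestures that progress through spatial regions in order; binding a callback to "gesture progress" would then amount to reacting to which state currently dominates the forward probabilities, though the framework's actual callback API is not shown here.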