Sebastian Feuerstack and Mauro dos Santos Anjo and Ednaldo Brigante Pizzolato
New forms of interaction, such as gesture-based control of interfaces, could enable interaction in situations where hardware controls are missing and could support impaired people where other controls fail. The rich spectrum of hand postures combined with movements offers great interaction possibilities, but requires extensive user testing to identify a user interface navigation control with sufficient performance and a low error rate. In this paper we describe a model-based interface design that assembles interactors and multimodal mappings to create multimodal interfaces. We propose using state charts for the rapid generation of different user interface controls in order to accelerate user testing.