Semantic Modelling in Support of Adaptive Multimodal Interface Design
Abstract
The design of multimodal interfaces requires intelligent data interpretation to ensure seamless adaptation to the user's needs and context. HMI (human-machine interaction) design accommodates varying forms of interaction patterns, depending on what is most appropriate for a particular user at a particular time. Such design patterns are a powerful means of documenting reusable design know-how. The semantic modelling framework presented in this paper captures the available domain knowledge in the field of multimodal interface design and supports adaptive HMIs. A collection of multimodal design patterns is constructed from a diversity of real-world applications and organized into a meaningful repository. This enables a uniform and unambiguous description of the patterns, easing their identification, comprehension and application.