Blind FLM: An Enhanced Keystroke-Level Model for Visually Impaired Smartphone Interaction
Human-Computer Interaction – INTERACT 2017, Part I
Conference paper, 2017

Authors: Shiroq Al-Megren, Wejdan Altamimi, Hend S. Al-Khalifa

Abstract

The Keystroke-Level Model (KLM) is a predictive model that numerically estimates how long an expert user takes to accomplish a task. KLM has been used successfully to model conventional interactions; however, it does not adequately capture smartphone touch interactions or accessible interfaces (e.g. screen readers). The Fingerstroke-Level Model (FLM), by contrast, extends KLM to describe and assess mobile game applications, which makes it a candidate model for predicting smartphone touch interactions. This paper further extends FLM for visually impaired smartphone users. An initial user study identified the basic elements of blind users' interactions, which were used to extend FLM; the new model is called "Blind FLM". An additional user study was then conducted to determine the applicability of the new model for describing blind users' touch interactions with a smartphone and to compute its accuracy. Evaluation showed that Blind FLM can predict blind users' performance with an average error of 2.36%.
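The core idea behind KLM-family models — predicting expert task time as the sum of the durations of the primitive operators in an action sequence — can be sketched as follows. The operator names and times below are the classic desktop KLM estimates from Card, Moran and Newell, used here only for illustration; the Blind FLM operators and values derived in this paper would replace them for screen-reader interaction.

```python
# Minimal sketch of keystroke-level prediction: total task time is the
# sum of the times of the primitive operators in the action sequence.
# Times are the classic desktop KLM estimates (illustrative only, NOT
# the Blind FLM values derived in the paper).
KLM_OPERATORS = {
    "K": 0.20,  # keystroke / tap
    "P": 1.10,  # point to a target
    "H": 0.40,  # home hands between devices
    "M": 1.35,  # mental preparation
}

def predict_time(sequence):
    """Return the predicted expert completion time (seconds) for an
    operator string such as "MPK"."""
    return sum(KLM_OPERATORS[op] for op in sequence)

# e.g. mentally prepare, point to a button, tap it:
print(round(predict_time("MPK"), 2))  # 2.65
```

Extending such a model, as this paper does, amounts to identifying new operators (e.g. screen-reader gestures) from user studies and estimating a time for each.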
Main file: 421756_1_En_10_Chapter.pdf (284.14 KB). Origin: files produced by the author(s).

Dates and versions

hal-01676160 , version 1 (05-01-2018)

Cite

Shiroq Al-Megren, Wejdan Altamimi, Hend S. Al-Khalifa. Blind FLM: An Enhanced Keystroke-Level Model for Visually Impaired Smartphone Interaction. 16th IFIP Conference on Human-Computer Interaction (INTERACT), Sep 2017, Bombay, India. pp.155-172, ⟨10.1007/978-3-319-67744-6_10⟩. ⟨hal-01676160⟩