Proceedings: GI 2012

Input finger detection for nonvisual touch screen text entry in Perkinput

Shiri Azenkot, Jacob Wobbrock, Sanjana Prasain, Richard Ladner

Proceedings of Graphics Interface 2012: Toronto, Ontario, Canada, 28-30 May 2012, 121-129

  • BibTeX

    @inproceedings{Azenkot:2012:IFD,
    author = {Azenkot, Shiri and Wobbrock, Jacob and Prasain, Sanjana and Ladner, Richard},
    title = {Input finger detection for nonvisual touch screen text entry in Perkinput},
    booktitle = {Proceedings of Graphics Interface 2012},
    series = {GI 2012},
    year = {2012},
    issn = {0713-5424},
    isbn = {978-1-4503-1420-6},
    location = {Toronto, Ontario, Canada},
    pages = {121--129},
    numpages = {9},
    publisher = {Canadian Human-Computer Communications Society},
    address = {Toronto, Ontario, Canada},
    }


We present Input Finger Detection (IFD), a novel technique for nonvisual touch screen input, and its application, the Perkinput text entry method. With IFD, users input signals into a device with multi-point touches, each finger representing one bit: either touching the screen or not. Maximum likelihood and tracking algorithms detect which fingers touch the screen based on user-set reference points. The Perkinput text entry method uses the 6-bit Braille encoding with audio feedback, enabling one- and two-handed input. A longitudinal evaluation with 8 blind participants who are proficient in Braille showed that one-handed Perkinput was significantly faster and more accurate than the iPhone's VoiceOver. Furthermore, in a case study to evaluate expert performance, one user reached an average session speed of 17.56 words per minute (WPM) with an average uncorrected error rate of just 0.14% using one hand for input. The same participant reached an average session speed of 38.0 WPM with two-handed input and an error rate of 0.26%. Her fastest phrase was entered at 52.4 WPM with no errors.
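The finger-to-bit mapping the abstract describes can be sketched as follows: each of six fingers corresponds to one dot in a standard 6-dot Braille cell, and a multi-point touch forms a 6-bit code that decodes to a character. This is a minimal illustration, not the paper's implementation; the function name and the small lookup table are assumptions, though the dot patterns themselves are standard Braille.

```python
# Illustrative sketch: dots 1-3 form the left column of a Braille cell
# (top to bottom), dots 4-6 the right column. A touch is the set of
# fingers (dot numbers) currently down.
BRAILLE_LETTERS = {
    frozenset({1}): "a",
    frozenset({1, 2}): "b",
    frozenset({1, 4}): "c",
    frozenset({1, 4, 5}): "d",
    frozenset({1, 5}): "e",
}

def decode_touch(fingers_down):
    """Map the set of touching fingers (dot numbers 1-6) to a character."""
    return BRAILLE_LETTERS.get(frozenset(fingers_down), "?")

print(decode_touch({1, 4}))  # dots 1 and 4 form the Braille letter "c"
print(decode_touch({1, 2}))  # dots 1 and 2 form the Braille letter "b"
```

In the actual system, the harder problem, which the abstract attributes to maximum likelihood and tracking algorithms, is deciding which dot each physical touch point corresponds to, given user-set reference points rather than fixed on-screen targets.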