Proceedings: GI 2009

Who dotted that ‘i’?: context free user differentiation through pressure and tilt pen data

Brian Eoff, Tracy Hammond

Proceedings of Graphics Interface 2009: Kelowna, British Columbia, Canada, 25-27 May 2009, pp. 149-156

  • BibTeX

    @inproceedings{Eoff:2009,
    author = {Eoff, Brian and Hammond, Tracy},
    title = {Who dotted that ‘i’?: context free user differentiation through pressure and tilt pen data},
    booktitle = {Proceedings of Graphics Interface 2009},
    series = {GI 2009},
    year = {2009},
    issn = {0713-5424},
    isbn = {978-1-56881-470-4},
    location = {Kelowna, British Columbia, Canada},
    pages = {149--156},
    numpages = {8},
    publisher = {Canadian Human-Computer Communications Society},
    address = {Toronto, Ontario, Canada},
    }

Abstract

With the proliferation of tablet PCs and multi-touch computers, collaborative input on a single sketched surface is becoming more and more prevalent. The ability to identify which user draws a specific stroke on a shared surface is widely useful in a) security/forensics research, by effectively identifying a forgery, b) sketch recognition, by providing the ability to employ user-dependent recognition algorithms on a multi-user system, and c) multi-user collaborative systems, by effectively discriminating whose stroke is whose in a complicated diagram. To ensure an adaptive user interface, we cannot expect or require that users self-identify or restrict themselves to a single pen. Instead, we prefer a system that can automatically determine a stroke's owner, even when strokes by different users are drawn with the same pen, in close proximity, and near in timing. We present the results of an experiment showing that the creator of an individual pen stroke can be determined with high accuracy, without supra-stroke context (such as timing, pen ID, or location), based solely on the physical mechanics of how the stroke is drawn (specifically, pen tilt, pressure, and speed). Results from free-form drawing data, including text and doodles, but not signature data, show that our methods differentiate a single stroke (such as the dot of an 'i') between two users at an accuracy of 97.5% and between ten users at an accuracy of 83.5%.
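To make the approach concrete, the pipeline the abstract describes can be sketched as: summarize each stroke into a small feature vector (pressure, tilt, speed) and classify that vector against per-user profiles. The sketch below is a hypothetical illustration, not the authors' implementation: the feature set, sample format, and the nearest-centroid classifier are all illustrative assumptions.

```python
# Hypothetical sketch of per-stroke user differentiation from pen
# dynamics (pressure, tilt, speed), in the spirit of the abstract.
# The feature choices and classifier are illustrative assumptions,
# not the paper's actual method.
from math import dist, hypot

def stroke_features(points):
    """points: list of (x, y, pressure, tilt_x, tilt_y) samples.
    Returns (mean pressure, mean tilt_x, mean tilt_y, mean speed)."""
    n = len(points)
    mean_p = sum(p[2] for p in points) / n
    mean_tx = sum(p[3] for p in points) / n
    mean_ty = sum(p[4] for p in points) / n
    # Speed approximated as mean distance between consecutive samples
    # (assumes a fixed sampling rate).
    steps = [hypot(b[0] - a[0], b[1] - a[1])
             for a, b in zip(points, points[1:])]
    mean_s = sum(steps) / len(steps) if steps else 0.0
    return (mean_p, mean_tx, mean_ty, mean_s)

def train_centroids(labeled_strokes):
    """labeled_strokes: list of (user_id, points).
    Returns {user_id: mean feature vector} as a per-user profile."""
    sums, counts = {}, {}
    for user, pts in labeled_strokes:
        f = stroke_features(pts)
        acc = sums.setdefault(user, [0.0] * len(f))
        for i, v in enumerate(f):
            acc[i] += v
        counts[user] = counts.get(user, 0) + 1
    return {u: tuple(v / counts[u] for v in acc) for u, acc in sums.items()}

def classify(points, centroids):
    """Assign a single stroke to the user with the nearest profile."""
    f = stroke_features(points)
    return min(centroids, key=lambda u: dist(f, centroids[u]))
```

A toy usage: train profiles from a heavy-pressure user and a light-pressure user, then classify an unlabeled stroke by whichever profile its feature vector lies closest to.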