Demo: Fusing mobile sensors for paper keyboard on-the-go

Anh Nguyen, Duy Nguyen, Nhan Nguyen, Ashwin Ashok, Binh Nguyen, Bao Pham, Tam Vu

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

Reliance on touchscreens has largely limited user input on small form-factor devices. To address this constraint, we explore a novel input mechanism, dubbed PaperKey, that enables users to interact with mobile devices by performing multi-finger typing gestures on the surface where the device is placed. Using acceleration signals sensed on the device, PaperKey infers the user's typing events and then leverages a vision-based technique to detect the exact typing locations on a paper keyboard layout. Compared with audio, image, or vibration sensing alone, this fused approach localizes keystrokes more accurately and with faster processing. Additionally, the mechanism preserves device mobility by working without external sensors.
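The abstract describes the fusion only at a high level: a vibration (accelerometer) stage flags that a keystroke happened, and a vision stage resolves where on the printed layout it landed. The Python sketch below illustrates that gating idea; every threshold, function name, and the grid-based key mapping is an illustrative assumption, not PaperKey's published method.

    import numpy as np

    # Illustrative constants -- the paper does not publish its parameters.
    ACCEL_SPIKE_THRESHOLD = 1.5           # jump in acceleration magnitude treated as a tap
    KEY_WIDTH_PX, KEY_HEIGHT_PX = 40, 60  # printed key size as seen by the camera, in pixels

    def detect_tap(accel_window):
        """Return True when the acceleration magnitude spikes within the window.

        accel_window: (N, 3) array of recent accelerometer samples (x, y, z).
        """
        magnitudes = np.linalg.norm(accel_window, axis=1)
        return magnitudes.max() - magnitudes.min() > ACCEL_SPIKE_THRESHOLD

    def localize_keystroke(fingertip_xy, layout_origin_xy):
        """Map a fingertip pixel position to a (row, col) key on the paper layout.

        fingertip_xy would come from a vision step (e.g., fingertip detection
        in a camera frame); layout_origin_xy is the layout's top-left corner.
        """
        col = int((fingertip_xy[0] - layout_origin_xy[0]) // KEY_WIDTH_PX)
        row = int((fingertip_xy[1] - layout_origin_xy[1]) // KEY_HEIGHT_PX)
        return row, col

    def on_sensor_update(accel_window, fingertip_xy, layout_origin_xy):
        """Fusion loop: the cheap vibration check gates the costlier vision step."""
        if detect_tap(accel_window):
            return localize_keystroke(fingertip_xy, layout_origin_xy)
        return None

Gating the camera with the accelerometer is one plausible reading of why the fused approach processes faster than image sensing alone: frames need only be analyzed when a tap has actually been detected.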

Original language: English
Title of host publication: MobiSys 2017 - Proceedings of the 15th Annual International Conference on Mobile Systems, Applications, and Services
Publisher: Association for Computing Machinery, Inc
Pages: 183
Number of pages: 1
ISBN (Electronic): 9781450349284
DOIs
State: Published - Jun 16 2017
Event: 15th ACM International Conference on Mobile Systems, Applications, and Services, MobiSys 2017 - Niagara Falls, United States
Duration: Jun 19 2017 - Jun 23 2017

Publication series

Name: MobiSys 2017 - Proceedings of the 15th Annual International Conference on Mobile Systems, Applications, and Services

Conference

Conference: 15th ACM International Conference on Mobile Systems, Applications, and Services, MobiSys 2017
Country/Territory: United States
City: Niagara Falls
Period: 06/19/17 - 06/23/17

Keywords

  • Multi-finger typing
  • Paper keyboard
  • Touching vibration
  • Vision-based localization
