TY - GEN
T1 - TYTH-Typing On Your TeetH: Tongue-Teeth Localization for Human-Computer Interface
T2 - 16th ACM International Conference on Mobile Systems, Applications, and Services, MobiSys 2018
AU - Nguyen, Phuc
AU - Truong, Hoang
AU - Pham, Duy
AU - Bui, Nam
AU - Suresh, Abhijit
AU - Dinh, Thang
AU - Nguyen, Anh
AU - Whitlock, Matt
AU - Vu, Tam
N1 - Publisher Copyright:
© 2018 Association for Computing Machinery.
PY - 2018/6/10
Y1 - 2018/6/10
AB - This paper explores a new wearable system, called TYTH, that enables a novel form of human-computer interaction based on the relative location of, and interaction between, the user’s tongue and teeth. TYTH allows its user to interact with a computing system by tapping on their teeth. This form of interaction is analogous to using a finger to type on a keypad, except that the tongue substitutes for the finger and the teeth for the keyboard. We study the neurological and anatomical structure of the tongue to design TYTH so that the obtrusiveness and social awkwardness caused by the wearable are minimized while its accuracy and sensing sensitivity are maximized. From behind the user’s ears, TYTH senses the brain and muscle signals that control tongue movement and captures the minute skin-surface deformation caused by that movement. We model the relationship between tongue movement and the recorded signals, from which a tongue localization technique and a tongue-teeth tapping detection technique are derived. Through a prototype implementation and an evaluation with 15 subjects, we show that TYTH can serve as a form of hands-free human-computer interaction, achieving an 88.61% detection rate and a promising adoption rate among users.
KW - Brain-muscle sensing
KW - Human Computer Interaction (HCI)
KW - Skin deformation sensing
KW - Tongue-teeth typing
KW - Wearable devices
UR - http://www.scopus.com/inward/record.url?scp=85051486728&partnerID=8YFLogxK
U2 - 10.1145/3210240.3210322
DO - 10.1145/3210240.3210322
M3 - Conference contribution
AN - SCOPUS:85051486728
T3 - MobiSys 2018 - Proceedings of the 16th ACM International Conference on Mobile Systems, Applications, and Services
SP - 269
EP - 282
BT - MobiSys 2018 - Proceedings of the 16th ACM International Conference on Mobile Systems, Applications, and Services
PB - Association for Computing Machinery, Inc
Y2 - 10 June 2018 through 15 June 2018
ER -