CRTypist: Simulating Touchscreen Typing Behavior via Computational Rationality

D Shi, Y Zhu, JPP Jokinen, A Acharya… - Proceedings of the CHI …, 2024 - dl.acm.org
Touchscreen typing requires coordinating the fingers and visual attention for button-
pressing, proofreading, and error correction. Computational models need to account for the …

Unblind Text Inputs: Predicting Hint-text of Text Input in Mobile Apps via LLM

Z Liu, C Chen, J Wang, M Chen, B Wu… - Proceedings of the CHI …, 2024 - dl.acm.org
Mobile apps have become indispensable for accessing and participating in various
environments, especially for low-vision users. Users with visual impairments can use screen …

Improving FlexType: Ambiguous Text Input for Users with Visual Impairments

D Gaines, K Vertanen - … of the 17th International Conference on …, 2024 - dl.acm.org
We present an improved version of the FlexType interface for nonvisual text input. FlexType
enables nonvisual text input on mobile touchscreen devices by allowing users to select from …

Perceptions of Blind Adults on Non-Visual Mobile Text Entry

D Gaines, K Vertanen - arXiv preprint arXiv:2410.22324, 2024 - arxiv.org
Text input on mobile devices without physical keys can be challenging for people who are
blind or have low vision. We interview 12 blind adults about their experiences with current mobile …

An Ambiguous Technique for Nonvisual Text Entry

DC Gaines - 2023 - search.proquest.com
Text entry is a common daily task for many people, but it can be a challenge for people with
visual impairments when using virtual touchscreen keyboards that lack physical key …