The ubiquitous virtual keyboards found on smartphones, tablets, and other touchscreen devices might someday be replaced by an invisible equivalent, if researchers at the Korea Advanced Institute of Science and Technology have their way. In a fascinating study published on the preprint server Arxiv.org this week (“I-Keyboard: Fully Imaginary Keyboard on Touch Devices Empowered by Deep Neural Decoder”), they propose a “fully imaginary” keyboard, the I-Keyboard, which lacks a predefined layout, shape, or size and taps AI to decode typing from any position at any angle. Notably, it requires no calibration, and the researchers claim that typists achieve 95.84% accuracy with it, comparable to a conventional virtual keyboard.

“Contemporary soft keyboards possess a few limitations. In fact, current soft keyboard techniques damage the usability of mobile devices in multiple ways other than the mobility,” wrote the coauthors, who point out that the lack of tactile feedback generally increases the rate of typos. “[Also], soft keyboards hinder mobile devices from presenting enough content because they occupy a relatively large portion on displays. Mobile devices provide smaller displays than non-mobile devices in general and soft keyboards can fill up to 40% of displays.”

To solve those formidable challenges, the scientists first compiled a data set by recruiting 43 participants who regularly use both physical QWERTY keyboards and soft keyboards. They had them type sentences on a touchscreen (which displayed no keys except a delete key and an enter key), following instructions conveyed by a separate screen positioned above the touchscreen. As they typed, the second screen highlighted each character as it was detected, ensuring a one-to-one mapping between touchpoints and characters. And at any point, the users could delete the touchpoints collected for the current sentence if they made a mistake.


Above: The data collection setup.

Study participants warmed up with 15 sentences before transcribing 150 to 160 sentences randomly sampled from the Twitter and 20 Newsgroups data sets (preprocessed so that the sentences included only the 26 lowercase English letters, enter, space, period, and apostrophe). Each participant took about 50 minutes, and together the sessions produced a corpus of 7,245 phrases and 196,194 keystrokes.

After normalizing the scales and removing the location offsets, the researchers found that the keys didn’t align on straight lines but rather on curves, which they believe resulted from the lack of restrictions on typing. And although each participant typed in slightly different ways, the mental keyboard models recovered from the corpus “consistently” resembled a physical keyboard layout, which the team claims indicates that users can reliably type on touchscreens even without guidance.
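The normalization step mentioned above can be sketched roughly as centering each user's touchpoints (removing the location offset) and rescaling them so that typing patterns from different users become comparable. The function below is an illustrative assumption, not the paper's exact procedure.

```python
import numpy as np

def normalize_touchpoints(points):
    """Center a user's (N, 2) touch coordinates and rescale to unit spread.
    Hypothetical sketch of the offset-removal / scale-normalization step."""
    points = np.asarray(points, dtype=float)
    centered = points - points.mean(axis=0)   # remove per-user location offset
    scale = centered.std()                    # single isotropic scale factor
    return centered / scale if scale > 0 else centered

touches = [(120.0, 410.0), (180.0, 405.0), (150.0, 470.0)]
norm = normalize_touchpoints(touches)
print(norm.mean(axis=0))  # centered: approximately [0, 0]
```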

The researchers next devised the I-Keyboard’s system architecture, which comprised three modules: a user interaction module, a data preparation module, and a communication layer. The user interaction module received input through a touchscreen or touch interface, the data preparation module preprocessed and formatted raw inputs, and the communication layer tightly integrated the machine learning framework with the app framework.
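The three-module flow described above might look like the following. The class names, methods, and stand-in decoder are assumptions for illustration only; they are not the authors' API.

```python
class UserInteractionModule:
    """Collects raw (x, y) touchpoints from the touch interface."""
    def capture(self, raw_events):
        return [(e["x"], e["y"]) for e in raw_events]

class DataPreparationModule:
    """Preprocesses and formats raw input for the decoder."""
    def prepare(self, points):
        # Illustrative preprocessing: center the touch sequence.
        cx = sum(x for x, _ in points) / len(points)
        cy = sum(y for _, y in points) / len(points)
        return [(x - cx, y - cy) for x, y in points]

class CommunicationLayer:
    """Bridges the app framework and the machine learning decoder."""
    def __init__(self, decoder):
        self.decoder = decoder
    def decode(self, features):
        return self.decoder(features)

# Wire the modules together with a placeholder decoder (one output
# character per touchpoint, standing in for the deep neural decoder).
ui, prep = UserInteractionModule(), DataPreparationModule()
comm = CommunicationLayer(lambda feats: "?" * len(feats))
events = [{"x": 10.0, "y": 20.0}, {"x": 30.0, "y": 40.0}]
text = comm.decode(prep.prepare(ui.capture(events)))
```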


Above: Examples of user mental models.

After divvying up the corpus into training, validation, and test sets and training the machine learning model on the training set, the team deployed a prototype I-Keyboard on a MacBook Pro. In experiments that tasked study participants with typing 20 phrases randomly selected from a phrase set, participants managed to type at 45.57 words per minute (or about 88.74% of their typing speed with a physical keyboard), surpassing the baseline by 4.06%.

The paper’s coauthors assert that because the I-Keyboard builds on typing habits users already have, they get the hang of it relatively quickly.

“[U]sers do not need to learn any new concept regarding I-Keyboard prior to usage. They can just start typing naturally by transferring the knowledge of physical keyboards,” they wrote. “[They] can keep typing even when they have taken off their hands phrase after phrase without an additional calibration step.”

The current iteration of I-Keyboard can support smartphones “with a few adjustments,” the researchers say, but they plan to extend it to other touchscreens and touch-sensing devices in the future. Additionally, they intend to implement support for nonalphabetic characters (e.g., numbers, punctuation, and function keys), potentially by adding a gesture detection algorithm whose gestures could be assigned to different keys.