
Spatial prediction: alphanumeric musings toward a more efficient mobile keyboard UX.

As owners of smartphones, we have had to make compromises in the convenience of interacting with the device when it comes to keyboard entry. Compared to a desktop or even a laptop keyboard, today's mobile devices, with their touch screens and tiny virtual keys, require new ways of using the limited real estate of the touch surface, which will remain small if these devices are to fulfill their role as ideal tools for mobile use.

One of the biggest compromises on mobile keyboards is the need to double or even triple up on key functions by using a toggle key. On physical keyboards this is done with "Shift", "Control" or "Alt" pressed in combination, but on mobile keyboards a single toggle is usually presented, and it separates the alphabetic characters on the keyboard from the numeric and symbolic ones.

Another option is often to allow a "long press" of a key to invoke the character that would be in that position when the toggle key is pressed as described above. For example, "4" and "$" are usually linked in this way on physical keyboards. The idea of "long pressing" has no analog on physical keyboards, which register signal occurrence rather than signal duration, so the heuristic doesn't work in that context, but it does work on a capacitive touch screen. Unfortunately, the "long" in "long press" is often too long for many users and in most cases can't be adjusted to taste by the user.
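The adjustability point can be made concrete with a small sketch. This is a hypothetical long-press detector, not any platform's real API; the class and parameter names are illustrative. The key idea is simply that the "long" threshold is an ordinary parameter that could be exposed to the user.

```python
# Hypothetical sketch: a long-press detector whose threshold is a plain,
# user-adjustable parameter -- the knob most mobile keyboards don't expose.
class LongPressDetector:
    def __init__(self, threshold_s=0.35):
        self.threshold_s = threshold_s  # the user-tunable "long" duration, in seconds
        self._down_at = None

    def key_down(self, now):
        # record when the finger landed on the key
        self._down_at = now

    def key_up(self, now):
        """Classify the completed press as 'long' or 'short'."""
        held = now - self._down_at
        self._down_at = None
        return "long" if held >= self.threshold_s else "short"

detector = LongPressDetector(threshold_s=0.25)  # shortened to taste
detector.key_down(0.0)
print(detector.key_up(0.3))  # -> long
```

A press held for 0.3 seconds counts as "long" only because the user chose a 0.25-second threshold; with the stock 0.35-second default it would have registered as an ordinary key press.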

These are necessary compromises given the limited screen real estate devoted to a keyboard that must share the same small screen with whatever field is being updated by the inputted text, a field that itself requires visual inspection.

Several innovations have emerged to make the act of typing on such a small device easier; the biggest of the last seven years is predictive swipe technology. These systems leverage a predictive model to take guesses at words based on how the letters that make up words trace semi-unique shapes on the capacitive touch displays of current devices. These methods have succeeded in dramatically increasing the rate of word input for average users and enabled input to happen fairly quickly using a single hand to do the swiping.

These swipe innovations improve the rate of text input, but other issues remain. The separation of alphabetic from numerical and symbolic characters leads to the need to toggle between modes when entering blocks of text that are predominantly alphabetic or predominantly numerical/symbolic, for example when inputting a phone number or account number. Swipe does not predict shifts from alphabetic to numerical or symbolic characters well at all.

The default heuristic for keyboards follows this pattern:

1) User enters field to invoke text input.

2) UI switches to default of alphabetic input.

3) User inputs alphabetic text.

4) User wishes to input numeric/symbolic text, and so either toggles to that mode by changing the layout using the equivalent of the "Ctrl" key OR long presses a key.

5) If the user is using the keyboard toggle, they can enter only one numerical/symbolic block per toggle, as the "space"/"enter" key returns the keyboard to alphabetic mode after each press. So typing "555 364 3345" requires two hits of "space", each of which then requires a hit of the toggle to re-enter numeric/symbolic mode.
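The flow above can be sketched in a few lines. This is a minimal, assumed model of the default behavior, not any real keyboard's implementation; the class and method names are illustrative. The point is the mode reset in step 5: "space" always snaps the layout back to alphabetic.

```python
# Minimal sketch of the default heuristic described above: the keyboard
# opens in alphabetic mode, a toggle switches layouts, and "space"
# returns the layout to alphabetic after each press.
class DefaultKeyboard:
    def __init__(self):
        self.mode = "alpha"  # step 2: default to alphabetic input
        self.text = ""

    def toggle(self):
        # step 4: switch between the alphabetic and numeric/symbolic layouts
        self.mode = "numsym" if self.mode == "alpha" else "alpha"

    def press(self, char):
        self.text += char
        if char == " ":
            # step 5: space reverts the layout to alphabetic
            self.mode = "alpha"

kb = DefaultKeyboard()
kb.toggle()            # enter numeric/symbolic mode
for ch in "555":
    kb.press(ch)
kb.press(" ")          # space snaps the mode back
print(kb.mode)         # -> alpha: the user must toggle again before "364"
```

Typing "555 364 3345" against this model forces two extra toggle presses, exactly the friction described in step 5.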

We can imagine a way to make a prediction about the type of text that the user is entering; in fact, we should be able to create a learning system that predicts whether the user will continue to input alphabetic or numerical text and modulate the function of the "space" key so that when it is pressed it does NOT revert the keyboard type to alphabetic.

We do this by thinking about the probabilities.

It should be clear that for any text input event (TIE) the characters can come in various type combinations. As the user employs the keyboard to input text into fields in various applications, a smart system can watch and count the types of characters input and classify them in the following way:

1) Record for every TIE the total number of characters input. This could be important in allowing us to determine whether there is a correlation between text input size and character type. For example, it would seem that entering phone numbers into phone fields will always produce short totals before the field is exited.

2) Record the total number of character transitions from alphabetic to numerical text. This tells us within a TIE how often the characters go into the numerical mode.

3) Record the total number of character transitions from numerical to alphabetic text. This tells us within a TIE how often the characters come OUT of the numerical mode. (This can differ in size from 2, since it's possible for a TIE to consist of a run of alphabetic characters and then end with a run of numerical characters, say the text "hey call me tonight my number is 342983 3321.")

4) Record the total number of sequential same-type events: every time a number or symbol follows another number or symbol, and every time an alphabetic character follows another alphabetic character. This is super important, as it allows us to determine the probability of a transition given the length of a sequence, as will be shown.
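The four counters above can be sketched as a small per-TIE recorder. This is an assumed design for illustration, not the post's actual implementation; in particular, lumping digits and symbols into one "num" class and skipping spaces (which live on both layouts) are my simplifying choices.

```python
# Sketch of the per-TIE counters from steps 1-4: total characters,
# alpha->numeric transitions, numeric->alpha transitions, and
# same-type continuations. Assumed design; names are illustrative.
class TIEStats:
    def __init__(self):
        self.total_chars = 0      # 1) total characters in the TIE
        self.alpha_to_num = 0     # 2) transitions into the numerical mode
        self.num_to_alpha = 0     # 3) transitions out of the numerical mode
        self.same_type_pairs = 0  # 4) a character follows one of the same type
        self._prev = None

    @staticmethod
    def _kind(ch):
        # digits and symbols share a layout, so treat them as one class
        return "alpha" if ch.isalpha() else "num"

    def feed(self, ch):
        self.total_chars += 1
        if ch == " ":
            return  # space sits on both layouts; skip it for type stats
        kind = self._kind(ch)
        if self._prev is not None:
            if self._prev == kind:
                self.same_type_pairs += 1
            elif self._prev == "alpha":
                self.alpha_to_num += 1
            else:
                self.num_to_alpha += 1
        self._prev = kind

stats = TIEStats()
for ch in "call me at 5553643345":
    stats.feed(ch)
print(stats.alpha_to_num, stats.num_to_alpha, stats.same_type_pairs)  # -> 1 0 16
```

Note how lopsided the counts are for a TIE like this: sixteen same-type continuations against a single transition, which is exactly the signal that would let a smarter "space" key stay in numeric mode mid-number.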

