Dasher

Role-play—Adaptive Technologies

Natural Language Processing (NLP) and Machine Learning (ML) can have a profound impact on how we interact with computers. We may not have computers that converse with us just yet, but the strides toward realizing this have enabled many advances in HCI (Human-Computer Interaction).

Rather than focus on obvious inroads such as Siri, let’s look at an interface that uses NLP in a more subdued way.

Imagine that you are quadriplegic, without the use of your hands, feet, arms, or legs. Movement is restricted to the neck up. You text friends using a rod you control with your mouth to individually punch keys on a keyboard. It’s exhausting and very slow. Making mistakes is painful; one mis-pressed keystroke becomes at least three:

  • the original mistaken character,
  • the backspace key, and
  • the (hopefully) correct keystroke.

Unfortunately, in this scenario you have also lost most of your capacity for speech: the motor neurons that control the muscles producing speech have been damaged. Imagine how improvements in NLP and speech recognition have helped others with similar conditions. Instead of manually typing text, they may speak it aloud as the computer transcribes. Predictive techniques generally “guess” the correct word when in doubt, and uncaught errors can then be corrected by keyboard.
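The core idea behind such predictive interfaces, including Dasher, is a statistical language model that estimates how likely each character is to come next given what has been entered so far. As a minimal sketch (a toy character-bigram model over a tiny made-up corpus; a real system would use a far richer model and far more training text):

```python
from collections import Counter, defaultdict

def train_bigram_model(text):
    """Count, for each character, how often each other character follows it."""
    counts = defaultdict(Counter)
    for prev, nxt in zip(text, text[1:]):
        counts[prev][nxt] += 1
    return counts

def next_char_probs(counts, prev):
    """Turn follow-counts for `prev` into a probability distribution."""
    total = sum(counts[prev].values())
    return {c: n / total for c, n in counts[prev].items()}

# Tiny illustrative corpus (purely hypothetical training data).
corpus = "hello how are you today hello there how is your day"
model = train_bigram_model(corpus)
probs = next_char_probs(model, "h")

# Characters that commonly follow "h" get larger probabilities --
# analogously, Dasher gives likelier letters larger target regions.
print(sorted(probs.items(), key=lambda kv: -kv[1]))
```

In this toy corpus, “h” is followed by “e” three times and “o” twice, so the model assigns them probabilities 0.6 and 0.4 respectively; everything else gets zero.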

Dasher

Dasher is a text-entry interface developed in the UK that has helped many people with similar impairments.

Explore the Dasher site, and if possible, download and execute an instance of Dasher. Experiment with the interface, and record your answers to the following questions:

  1. Some of the squares corresponding to letters are much larger than others. Also, sometimes one letter will be large, and other times, it will be relatively small. Why? What significance does the size of each letter’s area have?
  2. If you move randomly (but not wildly), actual words and phrases tend to pop up rather than plain gibberish. Why do you think that is?
  3. How might this experience be different if the application were trained on a different language? Why?
  4. What effect on performance do you anticipate if the application were continually retrained on your input?
  5. Spell some common words and phrases like, “Hello how are you today” and some uncommon words and phrases like, “my uncle used to love me but she died.” How do these experiences differ?
  6. How is Dasher similar to autocorrect?

Dasher +

In 2016, an article entitled A Brain–Computer Interface for the Dasher Alternative Text Entry System described an extension of Dasher that works through a brain-computer interface (BCI). The article reports that “a new software tool was developed to allow subjects to type words onto a computer screen via Dasher using their thoughts.”

  1. What are the implications of this technology?
  2. Why is a BCI-enabled Dasher more accessible than a linguistic BCI system?