Could EEGs have prevented Clippy? Microsoft taps brain scan for UI work

Anyone who has had to perform in-depth usability tests on others knows that people are just plain bad at describing what they are doing. In fact, it's a commonly accepted mantra that, when asking someone to describe what he is doing on a particular user interface, it's wise to also record the session or have an observer take notes. There are times when the user may say he is doing something that is the complete opposite of what he is actually doing, without even realizing it. (Thanks, Professor Burton, for beating that one into my head.)

Companies that develop some of the world's most well-known user interfaces, such as Microsoft and Apple, already know this, but it's still difficult at times to know exactly what the user might be thinking while navigating through a UI. And while someone has yet to invent a mind-reading device (which would no doubt be used to enhance people's love lives more than the usability of UIs), Microsoft wants to get closer to one by using an electroencephalograph (EEG) to read a user's brain states.

The company filed for a patent last year, titled "Using electroencephalograph signals for task classification and activity recognition," which describes a method for analyzing EEG signals as they correspond to different elements of a UI. The problem with EEGs, however, is that the signals are often muddied with extra data—the blink of an eye or the insatiable urge to scratch an itch can be enough to introduce some curveballs. Microsoft acknowledges that this can be a problem, and notes that there have been some efforts to filter out extra noise in EEG readings. This filtering is not always effective, though, and can be expensive to perform.
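To make the artifact problem concrete, here is a minimal sketch (not Microsoft's method, and all numbers are hypothetical) of one of the simplest filtering approaches: rejecting EEG epochs by amplitude threshold, since eye blinks produce far larger voltage swings than cortical activity.

```python
# Illustrative sketch: amplitude-threshold artifact rejection.
# Epochs whose peak absolute amplitude exceeds the threshold are
# assumed to contain blink or muscle artifacts and are discarded.

def reject_artifacts(epochs, threshold_uv=100.0):
    """Keep only epochs whose peak absolute amplitude stays below threshold."""
    clean = []
    for epoch in epochs:
        if max(abs(sample) for sample in epoch) < threshold_uv:
            clean.append(epoch)
    return clean

# Hypothetical epochs in microvolts: the second contains a blink-sized spike.
epochs = [
    [3.1, -2.4, 5.0, -4.2],     # typical cortical amplitudes
    [2.0, 150.0, -130.0, 1.5],  # blink artifact (well over 100 uV)
    [-6.3, 4.8, -1.1, 2.2],
]
print(len(reject_artifacts(epochs)))  # 2 epochs survive
```

The weakness the patent alludes to is visible even here: a hard threshold throws away whole epochs, and subtler noise below the threshold slips through, which is why robust filtering gets expensive.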

Because of this, Microsoft hopes to bypass the whole conundrum of filtering and instead focus on categorizing brain states that would then be applied when performing usability tests. This would involve taking sample data in a controlled environment, analyzing it for typical patterns (for example, the patent points out that "the P300 is a well-known signal pattern and is often referred to as the 'surprise' signal pattern"), and then categorizing it into different states of "interest" (which would vary depending on the tasks involved). The patent points out that different brain states may arise during tasks involving high or low workload, math calculations, 3D image rotation, and computer games.
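The calibrate-then-categorize idea can be sketched very roughly as a nearest-centroid classifier. This is not the patent's algorithm; the feature (mean signal power), the state labels, and the data below are all hypothetical, chosen only to show the two-phase shape: learn typical patterns from labeled calibration windows, then assign new windows to the closest learned state.

```python
# Illustrative sketch: categorize EEG windows into brain states by
# comparing a simple feature (mean signal power) against centroids
# learned from labeled calibration data.

def mean_power(window):
    """Average squared amplitude of one EEG window."""
    return sum(s * s for s in window) / len(window)

def train_centroids(labeled_windows):
    """Map each state label to the mean feature of its calibration windows."""
    sums, counts = {}, {}
    for label, window in labeled_windows:
        sums[label] = sums.get(label, 0.0) + mean_power(window)
        counts[label] = counts.get(label, 0) + 1
    return {label: sums[label] / counts[label] for label in sums}

def classify(window, centroids):
    """Assign the state whose centroid is nearest to the window's feature."""
    feature = mean_power(window)
    return min(centroids, key=lambda label: abs(centroids[label] - feature))

# Hypothetical calibration data: "high workload" windows have larger power.
calibration = [
    ("low_workload", [1.0, -1.2, 0.8, -0.9]),
    ("low_workload", [0.9, -1.0, 1.1, -0.8]),
    ("high_workload", [4.0, -3.8, 4.2, -3.9]),
    ("high_workload", [3.9, -4.1, 3.7, -4.0]),
]
centroids = train_centroids(calibration)
print(classify([3.8, -4.0, 4.1, -3.6], centroids))  # high_workload
```

The appeal of this shape is exactly what the article describes: no per-sample noise filtering is required, because the categorization only has to separate coarse states, not recover a clean signal.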

Ultimately, these brain states would then be applied to data taken during usability tests to analyze how people think when performing various computer tasks. Certain functions could cause much more cognitive activity than others (and in different ways), which would tell researchers how to tweak them to make the interface more user-friendly. By itself, the EEG data might provide Microsoft a new angle on usability, but not necessarily a complete one. That's why this data could be used in conjunction with other usability techniques, such as eye-tracking and screen capture programs, to create a more well-rounded assessment of a user's interaction with a given UI.