Do you know what people ignore on your web site? Do you know at
what point in your documentation customers stop reading and call
your tech support line? Do you know what users are doing between
clicks? Do you know whether your users get the intended impact from
your material? Would you like to know?
Understanding User Behavior
We are The Advanced Eye Interpretation Project at Stanford
University. Our research focuses on the human side of eyetracking:
eye-movements, what they are, what they mean, and how they reflect
a computer user's mental states. We study eye-movements because,
when a person uses a computer, his or her eye-movements convey a
continual stream of information about his or her mental state. By modeling
typical eye-movement behavior we can develop tools to better understand
user behavior. We can also develop next-generation tools that will
enable people to control computers with their eyes.
As a user interacts with a computer, our software runs in the background
recording the user's eye-movements, all keyboard and mouse activity,
and the images displayed on the computer screen. The user's eye-movements
are recorded by a top-of-the-line eyetracker, which allows the user
to interact comfortably with the computer.
Our visualization and analysis software correlates what was displayed
on the screen with keyboard and mouse activity and the user's eye-movements,
providing a clear picture of user activity and the ability to generate
quantitative measures for comparing data across subjects.
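Correlating the three recordings amounts to merging time-ordered event streams into one timeline. A minimal sketch of that idea, assuming simple timestamped event tuples (the stream formats and values here are illustrative, not the project's actual software):

```python
import heapq

# Illustrative event streams: (timestamp_seconds, kind, payload).
# These formats are assumptions for the sketch, not the real data files.
gaze = [(0.10, "gaze", (512, 300)), (0.15, "gaze", (520, 305))]
mouse = [(0.12, "click", (510, 298))]
keys = [(0.14, "key", "a")]

# Merge the three already-sorted streams into one unified timeline
# so gaze, mouse, and keyboard activity can be inspected together.
timeline = list(heapq.merge(gaze, mouse, keys, key=lambda e: e[0]))
for t, kind, data in timeline:
    print(f"{t:.2f}s {kind}: {data}")
```

Once the streams share a timeline, each gaze sample can be matched against whatever was on screen and whatever the user was doing at that instant.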
Our patented work on inferring mental states from eye-movement
patterns allows us to better describe what users are doing. This
ongoing research moves beyond merely asking where a person's eyes
are focused; it aims to infer high-level behaviors from observing
various patterns of eye-movement. Interesting results become apparent
when behaviors such as "reading" and "searching"
are analyzed as users interact with dynamic, complex applications.
The Future: Eye-Control of Computers
We can apply the ability to understand how users perceive current
technology to research next-generation technology. We have developed
(and are developing) the "Eye Interpretation Engine,"
which analyzes eye-movement data in search of recognizable patterns.
The Eye Interpretation Engine parses eye-position data into higher-level
patterns that can then be used to infer a user's mental state or
behavior. Because the Engine encapsulates our research on eye-movements,
applications can be written that make use of behavioral tokens without
requiring the application developer to be an expert in eye-movement.
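To make the idea of turning raw eye-position data into behavioral tokens concrete, here is a deliberately simplified sketch. The function name, the fixation format, and the thresholds are all assumptions for illustration; the actual Eye Interpretation Engine's analysis is far more sophisticated.

```python
import math

def classify_pattern(fixations):
    """Label a sequence of fixations as 'reading' or 'searching'.

    Each fixation is an (x, y, duration_ms) tuple. Reading tends to
    produce short, regular left-to-right steps along one line, while
    searching produces larger, less regular jumps around the screen.
    """
    if len(fixations) < 2:
        return "unknown"
    # Distance of each saccade (jump between consecutive fixations).
    steps = [math.hypot(x2 - x1, y2 - y1)
             for (x1, y1, _), (x2, y2, _) in zip(fixations, fixations[1:])]
    mean_step = sum(steps) / len(steps)
    # Illustrative threshold: small, consistent saccades suggest reading.
    return "reading" if mean_step < 60 else "searching"

reading_like = [(100, 200, 250), (140, 200, 230), (180, 202, 240), (220, 201, 250)]
searching_like = [(100, 200, 180), (400, 350, 150), (150, 500, 170), (600, 120, 160)]
print(classify_pattern(reading_like))    # reading
print(classify_pattern(searching_like))  # searching
```

An application built on such tokens can react to "the user is reading" or "the user is searching" without ever handling raw gaze coordinates itself.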
We ourselves used the Eye Interpretation Engine early on to write
an on-screen keyboard and mouse controller that enabled people to
"type with their eyes" and control the mouse. The keyboard
reacted differently to the user when the user was searching for
the next letter to type. Doing so increased user's enjoyment of
the system and reduced the number of incorrectly selected letters
because it allowed the user to search for the next letter at a leisurely
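One way such a keyboard could react differently is by relaxing its dwell-time selection threshold while the user appears to be searching. This sketch is a guess at the mechanism; the state names and timings are illustrative assumptions, not the actual implementation.

```python
# Hypothetical dwell thresholds (milliseconds) per inferred user state.
# While searching, a key must be stared at much longer before it fires,
# so browsing the keyboard doesn't trigger accidental selections.
SELECT_DWELL_MS = {"typing": 400, "searching": 900}

def select_key(dwell_ms, inferred_state):
    """Return True if a gaze dwell on a key should trigger selection."""
    threshold = SELECT_DWELL_MS.get(inferred_state, 900)
    return dwell_ms >= threshold

print(select_key(500, "typing"))     # True: quick selection while typing
print(select_key(500, "searching"))  # False: a browsing glance doesn't select
```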
Apply Our Research to Your Problem
We are continuing to develop the tools and methods necessary to
do exciting research. Why do you look at a person's eyes? Because
they look nice? Because it's polite? Sure. But you can now look
because the person's eye-movements provide you with continual feedback
about the person's mental state as he or she studies your material.
Greg Edwards, Senior Researcher
Advanced Eye Interpretation Project
Center for the Study of Language and Information (CSLI)
Cordura Hall, Room 227
210 Panama Street
Stanford, CA 94305-4115
Updated: December 1, 1999