WHAT IS EYE CONTROL?
Using one’s eyes is a natural method of demonstrating interest in an object or person and of gaining joint attention with a communication partner. Looking at an object or person in the immediate environment, or at choices displayed on an eye gaze frame, is a ‘no-tech’ or ‘low-tech’ gaze method of indicating choice and achieving communication.
However, for the purposes of this article, eye control refers to using a high-tech, computer-based system that reads eye movements and translates them into a target area on a computer screen. The ability to select areas of the screen allows the user to type by looking at letters on an on-screen keyboard, to access and control the entire desktop, or to make selections from a pre-designed communication interface.
All eye trackers consist of three basic elements and use mathematical algorithms to ascertain the point on the screen at which the user is looking. Very simply, eye control requires an infrared light source to illuminate the eyes, a camera to track the user’s head and pupils, and software to calculate the corresponding screen area at which the user is directing their gaze.
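To make those three elements concrete, here is a minimal sketch in Python of the per-frame logic, assuming the pupil centre and the corneal reflection of the infrared light (the ‘glint’) have already been located in the camera image; the names, numbers and the simple mapping are illustrative rather than drawn from any particular product.

```python
# A per-frame sketch: the infrared illuminator produces a reflection (glint) on
# the cornea, the camera image yields a pupil centre and a glint position, and
# software maps the pupil-glint vector to a point on the screen.
from dataclasses import dataclass

@dataclass
class EyeFeatures:
    pupil_x: float   # pupil centre, in camera-image coordinates
    pupil_y: float
    glint_x: float   # corneal reflection of the infrared light source
    glint_y: float

def gaze_point(features, mapping):
    """Convert the pupil-glint vector into a screen coordinate using a calibrated mapping."""
    dx = features.pupil_x - features.glint_x
    dy = features.pupil_y - features.glint_y
    return mapping(dx, dy)

# A simple linear mapping stands in for a real calibrated model (see the
# calibration sketch later in this article).
fake_mapping = lambda dx, dy: (960 + 40.0 * dx, 540 + 40.0 * dy)
print(gaze_point(EyeFeatures(412.0, 300.5, 405.0, 296.0), fake_mapping))   # (1240.0, 720.0)
```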
Once the user has looked at the target on screen, a selection is made either by blinking, pressing a switch, or by means of a ‘dwell’ selection. A dwell selection requires the user to maintain their gaze on the target for a specified, adjustable amount of time. Typical times range from 0.5 seconds to 2 seconds but are very dependent on what is successful and comfortable for the user.
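The dwell mechanism itself can be sketched in a few lines. The Python below assumes the tracker delivers timestamped gaze samples that have already been resolved to whichever on-screen target (if any) the gaze falls on; the function name and the sample stream are invented for illustration.

```python
# A sketch of dwell selection: a target is selected only after gaze has rested
# on it continuously for the chosen dwell time.
def dwell_select(samples, dwell_time=1.0):
    """Yield a target each time gaze rests on it for `dwell_time` seconds.

    `samples` is an iterable of (timestamp_in_seconds, target) pairs, where
    `target` is whatever the gaze currently falls on, or None for empty space.
    """
    current, started = None, None
    for timestamp, target in samples:
        if target != current:
            current, started = target, timestamp            # gaze moved: restart the timer
        elif target is not None and timestamp - started >= dwell_time:
            yield target                                    # dwell complete: make the selection
            current, started = None, None                   # a fresh dwell is needed before the next selection

# Example: gaze rests on the letter 'A' for just over one second.
stream = [(0.0, "A"), (0.4, "A"), (0.9, "A"), (1.1, "A"), (1.3, None)]
print(list(dwell_select(stream, dwell_time=1.0)))           # ['A']
```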
WHO CAN USE EYE CONTROL?
Traditionally, eye control has been used by those who have no other means of communication and no reliable control of any other part of their body. This might include people with high spinal lesions, quadriplegia, traumatic brain injury (TBI), people who have had a stroke or have locked-in syndrome, those with MND/ALS, athetoid cerebral palsy, muscular dystrophy, Rett syndrome and so on. For anybody who has no other reliable movement, or who has great difficulty controlling their movements, eye control can offer a much more comfortable access method. Eye control is also sometimes chosen as an additional access method by those who can use another option but find it difficult, painful, slow or frustrating to do so. For this group of people it is often more comfortable, less fatiguing, faster and less physically demanding to use their eyes, and they can choose which access method to use for different tasks.
EYE CONTROL’S EFFECT ON COMFORT AND FATIGUE
Eye muscle control does not seem to be involved in the asymmetrical tonic neck reflex (ATNR) or the involuntary movements that are often triggered by the use of other muscles, so people usually sit quite still and comfortably when using their eyes to control a computer. This is especially true once the cognitive load and ‘pressure’ of using a new skill has reduced. As the person becomes accustomed to the concept of using their eyes in a deliberate and controlled way, their body usually relaxes, the physical effort involved in maintaining their position and gaze usually decreases, and the subsequent involuntary, often uncomfortable, movements also decrease.
VISION: A COMPLEX SENSE
Vision is a complex sense. Visual acuity is required to see an object clearly; cognitive recognition is required to identify what the object is; visual attention and oculomotor ability are required to search for the object and to shift or maintain gaze on it; and adequate head control is required to facilitate all of the above. A person who uses eye control should have the visual and cognitive abilities to see the screen sufficiently clearly and to identify targets on it; however, even complex visual-perceptual and cognitive difficulties can be accommodated in the design of the interfaces with which the person interacts.
SELECTING AN APPROPRIATE EYE TRACKING DEVICE
Choice of an appropriate eye control device depends on the user’s requirements. Some eye trackers can accommodate involuntary head movement or involuntary eye movement (nystagmus), while others require the user to sit very still. There are two main categories of eye tracker: those that clip on to a monitor or communication device, and those that are integrated (built) into the computer. Clip-on eye trackers have the advantage that they can be moved to new hardware as technology changes, while integrated devices usually offer more communication-specific features than an ordinary laptop or desktop computer.
Some devices work well with popular symbol- or text-based communication software such as The Grid 3 or Tobii Communicator 5. The specific needs, physiological characteristics and preferences of the user must be carefully considered so that the most appropriate eye tracker can be selected.
Once the device has been selected, appropriate mounting is also required. The eye tracker must be placed so that the camera has an uninterrupted view of the user’s eyes and is parallel to the user’s head; thus, if the user is lying on their side, the screen and eye tracker should also be mounted sideways.
SUPPORTIVE INTERACTION FEEDBACK
A critical factor for successful eye control use is the way that the user experiences the mouse movement that they are creating with their eyes. For people with cognitive difficulties it can be confusing to see the mouse cursor moving on the screen and not realize that they are in fact moving it. A ‘chasing’ effect then occurs where the user tries to look at the very object they are moving. It can also be confusing to some users if they are looking directly at their desired target but the mouse cursor is a short distance off target, which can happen if the system is not perfectly calibrated. Some people are able to accommodate that difference and look ‘beyond’ their target, but others require more supportive strategies.
One method of addressing these challenges is to hide the mouse cursor altogether, so that the user sees only, for example, a highlighted border around the target they are currently looking at. Another useful interaction tool is to give the user visual feedback on the length of time required to activate the selection. This is often presented as a clock timer, so the user knows to keep their gaze still until the clock has completed a full rotation. Critically, this visual feedback is presented in the centre of the target, which helps the user feel accurate even if their calibration is slightly off.
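A minimal sketch of this centred-feedback strategy is shown below in Python, assuming on-screen targets are simple rectangles; it illustrates the idea rather than how any particular product implements it, and all names and values are invented.

```python
# A sketch of centred feedback: the dwell clock or highlight is drawn at the
# centre of whichever target contains the gaze point, rather than at the raw
# (possibly slightly miscalibrated) cursor position.
from dataclasses import dataclass

@dataclass
class Target:
    name: str
    x: float
    y: float
    width: float
    height: float

    def contains(self, gx, gy):
        return self.x <= gx <= self.x + self.width and self.y <= gy <= self.y + self.height

    def centre(self):
        return (self.x + self.width / 2, self.y + self.height / 2)

def feedback_position(gaze, targets):
    """Return (target, feedback position), or (None, None) when gaze is between targets."""
    gx, gy = gaze
    for target in targets:
        if target.contains(gx, gy):
            return target, target.centre()   # snap the feedback to the target centre
    return None, None                        # hide the feedback rather than show a stray cursor

# Example: gaze lands near the edge of a large 'YES' button, but the feedback
# still appears in the middle of it.
buttons = [Target("YES", 0, 0, 400, 300), Target("NO", 500, 0, 400, 300)]
print(feedback_position((390, 20), buttons))   # YES button, feedback at (200.0, 150.0)
```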
ASSESSMENT PROCEDURES
Background information relating to the user’s visual abilities, motivating interests, cognitive abilities, level of literacy and receptive/expressive language should be gathered prior to the assessment. The expectations of the client and their support team should be investigated so that these can be managed, and so that there is something against which to measure the outcome of the assessment. There are likely to be many people involved or interested in attending the assessment; however, the more people who attend, the greater the potential pressure on the client to succeed. Those who do attend should do so quietly and discreetly, allowing the assessor and client to focus entirely on the task at hand. The environment (noise, lighting, space, temperature) should also be considered – an unexpected noise might trigger a ‘startle reflex’ and affect the client’s performance with the eye tracker, for example.
CALIBRATION PROCESS
Calibration is the process by which the eye control system learns to interpret where on the screen the user is looking. The better the calibration, the more accurate the eye control will be. Calibrating usually involves looking in turn at several pre-defined areas of the screen, focusing on a target (often a coloured dot) presented in each area. Some eye trackers allow customisable calibration stimuli, so that, for example, a photograph can be used instead.
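The sketch below illustrates in Python the kind of mapping a calibration might produce, assuming the tracker records a raw measurement (such as the pupil-glint vector) while the user fixates each dot; a simple affine fit by least squares stands in for the more sophisticated models real systems use, and all values are invented for illustration.

```python
# A sketch of what calibration computes: raw eye measurements recorded while
# the user fixates each calibration dot are fitted to the known dot positions,
# producing a function from raw measurements to screen coordinates.
import numpy as np

def fit_calibration(raw_points, screen_points):
    """Fit an affine mapping screen = [raw_x, raw_y, 1] @ coeffs by least squares."""
    raw = np.asarray(raw_points, dtype=float)
    screen = np.asarray(screen_points, dtype=float)
    design = np.column_stack([raw, np.ones(len(raw))])         # rows of [raw_x, raw_y, 1]
    coeffs, *_ = np.linalg.lstsq(design, screen, rcond=None)   # one column of coefficients per screen axis
    return lambda rx, ry: tuple(np.array([rx, ry, 1.0]) @ coeffs)

# Example: five calibration dots (the corners and centre of a 1920x1080 screen)
# and the raw measurements recorded while the user looked at each one.
raw = [(-5.0, -3.0), (5.0, -3.0), (-5.0, 3.0), (5.0, 3.0), (0.0, 0.0)]
dots = [(100, 100), (1820, 100), (100, 980), (1820, 980), (960, 540)]
gaze_from_raw = fit_calibration(raw, dots)
print(gaze_from_raw(2.5, 1.5))   # approximately (1390.0, 760.0)
```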
GRADED AND FAILURE-PROOF ASSESSMENT ACTIVITIES
Finding and creating assessment activities that are motivating, interesting and fun for the user is critical for their involvement. Activities should be designed so that there is no potential for failure: initially, the user should be allowed to explore eye tracking through a fail-proof series of activities, and once they are feeling confident it may become appropriate to increase the demands of the activity. The goal of the assessment should simply be to establish whether the access method works – what the client eventually does with the eye tracker can be customised and adapted over time, ideally with input from the client themselves.
PARTICIPATORY DESIGN
Involving the client in the design and optimisation of their interfaces is, for some, another critical factor in successful use. Because visual-perceptual difficulties are so often associated with stroke, cerebral palsy, TBI and similar conditions, it is very helpful, wherever possible, to ask the client about their preferences for text colour, position, size and so on.
MODIFICATION AND SUPPORT
The client’s use of the eye tracker may change over time. They may want to add more functionality to their interfaces, they may become more practised at operating the eye control and thus able to manage smaller targets, or their physical condition may change, so the design and layout of the interfaces, as well as the positioning of the eye tracker, should be revisited from time to time. The muscles that control the eyes are like any other muscles and will require exercise in order to build up stamina. Eye drops may be helpful for moisturising tired eyes; however, it is best to apply them after an eye control session rather than at the start, as the liquid can create additional reflections on the eye which could affect eye tracking.