Eye-Tracking Facility

Eye-movements

As we go about our daily lives, our eyes are continually moving around, sampling the information that is available in the visual environment. For instance, if we want to know how somebody is feeling, we will fixate in turn on the different features of their face. When reading a story, our eyes will move from word to word, lingering on words that are more difficult to read, and backtracking when we realise that something does not make sense. When we hear somebody talking, our eyes also tend to be drawn to objects in the visual scene that we expect the speaker to mention next. By monitoring people’s eye-movements, it is possible to gain important insights into the underlying cognitive processes involved in completing these tasks. Eye-tracking also provides a critical objective measure of someone’s focus of visual attention during a particular task.

Facilities in Cognitive Science

The Department of Cognitive Science operates the following eye-tracking systems. Each of these systems has different strengths, enabling a wide-range of applications.

More on the EyeLink Systems

Eye Tracker   | Mount            | Location                                                        | Integrated Equipment                  | Ocularity | Sampling Rate
EyeLink 1000  | Remote (Desktop) | Australian Hearing Hub                                          | Standalone                            | Monocular | 500 to 1000 Hz
EyeLink 1000  | Remote (Desktop) | Australian Hearing Hub                                          | Biopac skin conductance (GSR) system  | Monocular | 500 to 1000 Hz
EyeLink 1000  | Remote (Desktop) | Macquarie University Hospital                                   | Standalone                            | Monocular | 500 to 1000 Hz
EyeLink 1000  | Tower            | ERP Facility, Australian Hearing Hub                            | Neuroscan SynAmps 2 EEG system        | Monocular | 1000 Hz
EyeLink 1000  | Long Range       | KIT-Macquarie Brain Imaging Lab (MEG), Australian Hearing Hub   | Adult and Child MEG systems           | Monocular | 1000 Hz
EyeLink 1000  | Long Range       | Macquarie Medical Imaging (MRI), Macquarie University Hospital  | fMRI scanner                          | Monocular | 1000 Hz
EyeLink II    | Head-mounted     | Australian Hearing Hub                                          | Standalone                            | Binocular | 500 Hz
ASL EYE-TRAC  | Head-mounted     | Cognition in Action Facility, Australian Hearing Hub            | Northern Digital (NDI) Optotrak       | Monocular | 320 Hz

EyeLink 1000 remote eye-trackers

The EyeLink remote eye-tracking systems incorporate a small camera and infrared illuminator mounted on the desktop in front of the display screen. Participants wear a small circular sticker on the forehead, which enables the system to track their head position as they move around freely (around 20 cm in any direction). This makes the system particularly suitable for testing children and other special populations. In remote mode the system samples at 500 Hz, but it can record at 1000 Hz and with higher spatial resolution if the participant's head is stabilised on a chin-rest.

Each remote system is set up for use with a chin-rest by default. Changing between the head-stabilised and free-motion configurations requires changing the lens installed in the system, and needs to be organised with Dr Nathan Caruana in advance of booking subjects.

EyeLink II head-mounted eye-tracker

The system uses three cameras: two high-speed cameras allow binocular (or dominant-eye monocular) recording of eye-movements, while the third camera tracks four infrared markers mounted on the display screen, allowing the EyeLink software to compensate automatically for small head movements by tracking the position of the subject's head relative to these markers. This means that a chin-rest or other head restraint is not required. Pupil measurements are recorded at 500 Hz (one sample every 2 milliseconds). The head-mounted tracker is currently the only system in the department that enables binocular recording. It is particularly suitable for research that involves interaction with a touch-screen, and in principle it could be upgraded (at cost) to allow eye-tracking of interactions with three-dimensional objects.

EyeLink 1000 tower-mounted system

The tower-mounted EyeLink 1000 system provides the greatest spatial and temporal resolution. To achieve this, the participant's head is stabilised on a chin-rest. The camera is positioned above the chin-rest and monitors eye-movements via a mirror placed between the participant and the computer screen. The mirror reflects infrared light but lets visible light pass through, so the participant's view of the computer screen is not affected. Our tower-mounted eye-tracker is integrated with a Neuroscan EEG system, allowing brain responses to be time-locked to fixations on particular areas of interest on the screen.

Information for Lab Users

Experiment Presentation

The EyeLink systems are compatible with a range of experiment presentation software packages, but new users are recommended to use the Experiment Builder software from SR Research, the makers of EyeLink. Experiment Builder provides a relatively straightforward graphical user interface, allowing researchers with little programming experience to set up an eye-tracking experiment in a matter of hours. Running the Experiment Builder software requires a licence key, which can be booked for short periods using the department's room-booking system. Once an experiment is complete, it can be deployed as an executable file that will run on any suitable computer without the need for a licence key.
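
Researchers who prefer to script their experiments directly (for example, in Python) can also control an EyeLink tracker through SR Research's pylink library, which is installed with the EyeLink Developers Kit. The sketch below is illustrative only, not a supported departmental workflow: the host address, file names, and trial messages are placeholder assumptions, and a real experiment would also handle calibration, drift correction, and stimulus presentation.

    # Minimal sketch of recording a single "trial" with pylink.
    # Assumes the Display PC is connected to the Host PC at the
    # default address 100.1.1.1; names and delays are placeholders.
    import pylink

    # Connect to the EyeLink Host PC and open an EDF file on it
    # (EDF file names must be 8 characters or fewer).
    tracker = pylink.EyeLink("100.1.1.1")
    tracker.openDataFile("demo.edf")

    # Start recording samples and events to the file and the link.
    tracker.startRecording(1, 1, 1, 1)

    # Messages like these are used by Data Viewer to segment trials.
    tracker.sendMessage("TRIALID 1")
    pylink.msecDelay(2000)          # stand-in for stimulus presentation
    tracker.sendMessage("TRIAL_RESULT 0")

    tracker.stopRecording()

    # Close the EDF file and copy it from the Host PC to the Display PC.
    tracker.closeDataFile()
    tracker.receiveDataFile("demo.edf", "demo.edf")
    tracker.close()

Scripted control of this kind is mainly useful when an experiment needs to integrate with software that Experiment Builder does not support; for most studies the recommended Experiment Builder route is simpler.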

Data Analysis

EyeLink Data Viewer software enables detailed analysis of eye-movements. Users can identify areas of interest within each visual display and produce fixation reports (number of fixations; total duration of fixations; onset time of the first, second, or third fixation on a particular area of interest). They can also create saccade reports, which provide information about the ballistic properties of eye-movements (e.g., saccade onset, peak velocity). QuickTime videos of individual trials can be exported with eye-movements superimposed.

The EyeLink Data Viewer software can be downloaded freely from the SR Research Support Forums. It will run in a reduced-capacity mode unless a USB licence key is inserted into your Windows or Mac computer.
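
Because fixation and saccade reports can be exported from Data Viewer as tab-delimited text files, they are straightforward to summarise in standard analysis environments. The following is a minimal sketch in Python with pandas; the file name and column names are assumptions based on commonly used Data Viewer report variables and should be adjusted to match the variables you selected when generating the report.

    # Minimal sketch: total fixation time and fixation count per trial
    # and interest area from an exported Data Viewer fixation report.
    import pandas as pd

    # Data Viewer reports are tab-delimited; "." typically marks missing values.
    fix = pd.read_csv("fixation_report.txt", sep="\t", na_values=["."])

    summary = (
        fix.groupby(["TRIAL_INDEX", "CURRENT_FIX_INTEREST_AREA_LABEL"])
           .agg(total_fix_duration=("CURRENT_FIX_DURATION", "sum"),
                n_fixations=("CURRENT_FIX_DURATION", "count"))
           .reset_index()
    )

    print(summary.head())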

Projects

Current Projects 2018

  • The Influence of Temporal Attention on Visual Search (A/Prof Anina Rich, Mr Phillip Cheng)
  • A Study of Novel Vocabulary Learning (Dr Lisi Beyersmann, Ms Signy Wegener, Dr Hua-Chen Wang, Prof. Anne Castles)
  • Investigating Implicit Visual Episodic Memory (Ms Haleh Kloshkhouy-Delshad, A/Prof Anina Rich, Prof. Mark Williams)
  • A Multi-letter Processing Intervention in Dyslexia (Ms Iuliia Fokina, Dr Saskia Kohnen, Prof. Genevieve McArthur)
  • Attention, eye movements and perception (Ms Samantha Parker, A/Prof Matthew Finkbeiner)
  • Eye-tracking the Effect of Semantic Decoding in Learning to Read in Chinese (Ms Luan Li, Dr Hua-Chen Wang, Dr Lili Yu, Dr Eva Marinus, Prof. Anne Castles)
  • A Virtual Reality Study of Social Interaction in Autism and Typical Development (Dr Nathan Caruana, Ms Christine Inkley, Dr David Kaplan, Prof. Genevieve McArthur)
  • Social Cognition in People with Schizophrenia (Ms Colleen Murphy, A/Prof. Robyn Langdon)
  • Learning to Read New Words (Ms Lyndall Murray, Ms Signy Wegener, Prof. Anne Castles, Prof. Rauno Parrila, Dr Hua-Chen Wang)

Academic Contact


Useful Links

