Calibration

To calculate gaze data with high accuracy and precision, the eye tracker firmware needs to adapt its algorithms to the person sitting in front of the tracker. This adaptation is done during the calibration process, while the user looks at points located at known coordinates. The calibration is initiated and controlled by the client application.

The calibration procedure

The calibration of the eye tracker is typically done as follows:

  1. A small animated object is displayed, to catch the user’s attention.
  2. When it arrives at the calibration point location, the object rests for about 0.5 seconds to give the user a chance to focus on it. Good practice is to shrink the object to help the user focus the gaze on its center, i.e. the calibration point location.
  3. When the user has focused his or her gaze on the calibration point, the eye tracker is told to start collecting data for that specific calibration point.
  4. The eye tracker collects data for the calibration point and sends a notification to the client application when the data collection is completed.
  5. If the object has been shrunk, it is now returned to its original size.
  6. The object is moved to the next calibration point location, usually by using an animation.
  7. Repeat steps 2-6 for all desired calibration points.
  8. The calibration result is computed and reviewed in a calibration plot. (For screen-based eye trackers only)
  9. If the data for one or more calibration points is missing or has low accuracy or precision, steps 2-6 and 8 are repeated for those points.
  10. Once the calibration result is satisfactory, the calibration procedure is concluded.
  11. (Optional) After the participant is calibrated, perform a validation of the estimated performance. Read more about Calibration Validation.

The animation in step 1 should not be too fast, nor should the shrinking in step 2 be too fast. Otherwise the user may not be able to get a good calibration result, simply because he or she does not have time to focus the gaze on the target before the eye tracker starts collecting calibration data.
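
As a rough sketch of this timing, the loop below walks through steps 2-6 using the Python binding (tobii_research). The helpers move_target_to, shrink_target and restore_target are hypothetical placeholders for your own stimulus drawing code, and the 0.5 second pause corresponds to step 2.

    # Sketch of the point-by-point timing in steps 2-6, using the Python binding.
    # The drawing helpers below are hypothetical placeholders for your own code.
    import time
    import tobii_research as tr

    def move_target_to(x, y):
        print("Animate the target to ({:.2f}, {:.2f})".format(x, y))

    def shrink_target():
        print("Shrink the target")

    def restore_target():
        print("Restore the target to its original size")

    def show_and_collect(calibration, points):
        # 'calibration' is a tr.ScreenBasedCalibration that has already entered
        # calibration mode; 'points' are (x, y) tuples in normalized coordinates.
        for x, y in points:
            move_target_to(x, y)                 # steps 1/6: animate, not too fast
            time.sleep(0.5)                      # step 2: let the user settle
            shrink_target()                      # step 2: help the user fixate the center
            if calibration.collect_data(x, y) != tr.CALIBRATION_STATUS_SUCCESS:
                calibration.collect_data(x, y)   # steps 3-4: retry once if collection failed
            restore_target()                     # step 5: back to the original size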

The normal number of calibration points is 2, 5, or 9. More points can be used, but the calibration quality will not increase significantly beyond 9 points. A 5-point calibration usually yields a very good result without being experienced as too intrusive by the user.

NOTE: (For screen-based eye trackers) It is possible to do a calibration without using a screen. The procedure is similar to what is described above, but instead of showing the points on a screen, the points should be presented at the correct locations within the Active Display Area (see section Coordinate systems for more information about the Active Display Area). This is typically done by drawing the points on a piece of cardboard of the same size as the Active Display Area, positioned in the Active Display Area plane that was provided to the eye tracker when its position was configured. The user should then be asked to look at the points in a predefined order and prompted to switch focus from one calibration point to the next.

Configuring the calibration points

Screen-based calibration

For screen-based eye trackers, the calibration points are given in the Active Display Coordinate System and are usually shown on a screen.

The location of the calibration points is decided by the client application. A typical calibration pattern for 5 points can be seen below and is also expressed in code at the end of this subsection. The coordinates illustrate common locations of the calibration points as expressed in the Active Display Coordinate System.

Typical calibration pattern

NOTE: All points are given in normalized coordinates in such a way that (0.0, 0.0) corresponds to the upper left corner and (1.0, 1.0) corresponds to the lower right corner of the Active Display Area. When choosing the calibration points it is important to consider the following:

  • The calibration points should span an area that is as large as or larger than the area where the gaze controlled application or the stimuli will be shown, in order to ensure good data.
  • The calibration points must be positioned within the area that is trackable by the eye tracker and be within the Active Display Area.
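
Expressed in the Python binding, the typical 5-point pattern could look like the list below; the exact coordinates are a common choice rather than values required by the SDK.

    # A common 5-point pattern in the Active Display Coordinate System
    # (normalized coordinates, (0.0, 0.0) = upper left, (1.0, 1.0) = lower right).
    points_to_calibrate = [
        (0.5, 0.5),   # center
        (0.1, 0.1),   # upper left
        (0.9, 0.1),   # upper right
        (0.1, 0.9),   # lower left
        (0.9, 0.9),   # lower right
    ]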

The calibration state

To be able to perform a calibration, the client application must first enter the calibration state. The calibration state is an exclusive state which can only be held by one client at a time. It is entered by calling the Enter Calibration Mode function and left by calling the Leave Calibration Mode function. Whenever a client enters or leaves the calibration state, a Calibration Mode Entered or Calibration Mode Left event/callback/notification is sent to all clients connected to the same eye tracker, including the calibrating client. These notifications are mostly meant for user interface purposes, like graying out a “Calibrate” button. Since the communication with the eye tracker is asynchronous, it is considered best practice to rely on the result reported by the Enter Calibration Mode function to detect whether another client is currently calibrating before initiating the calibration procedure, rather than caching the state from the calibration mode events.
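
A minimal sketch of this in the Python binding is shown below. It assumes that the notifications are delivered through the same subscribe_to mechanism as the data streams, and the exception handling is kept generic for illustration; consult the API reference for the exact error types.

    import tobii_research as tr

    eyetracker = tr.find_all_eyetrackers()[0]
    calibration = tr.ScreenBasedCalibration(eyetracker)

    def on_calibration_mode_entered(notification):
        # For user interface purposes only, e.g. graying out a "Calibrate" button.
        print("A client entered calibration mode:", notification)

    eyetracker.subscribe_to(tr.EYETRACKER_NOTIFICATION_CALIBRATION_MODE_ENTERED,
                            on_calibration_mode_entered)

    try:
        calibration.enter_calibration_mode()   # the authoritative check
    except Exception as error:
        # The call fails if another client currently holds the calibration state.
        print("Could not enter calibration mode:", error)
    else:
        # ... collect calibration data here ...
        calibration.leave_calibration_mode()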

Some operations can only be performed in the calibration state, e.g. collecting data, discarding data, and computing and applying a calibration. Other operations, such as applying a saved calibration or retrieving a calibration, can be used at any time. However, if the eye tracker is in calibration mode, only the client that put it in that mode can apply a calibration to it.

Applying a calibration

The Compute and Apply function should be called once all calibration points have been shown and data collected. It uses the collected data to calculate an eye model based on the person in front of the eye tracker. The calibration can be recalculated with new input from calibration points until the calibration state is left.
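
Putting the pieces together, a calibration round in the Python binding could look roughly like the sketch below; the point chosen for recalibration is just a hypothetical example of a point judged to have poor data.

    import tobii_research as tr

    eyetracker = tr.find_all_eyetrackers()[0]
    calibration = tr.ScreenBasedCalibration(eyetracker)
    points_to_calibrate = [(0.5, 0.5), (0.1, 0.1), (0.9, 0.1), (0.1, 0.9), (0.9, 0.9)]

    calibration.enter_calibration_mode()
    for point in points_to_calibrate:
        # Show the stimulus at 'point', give the user time to fixate it,
        # then collect data; retry once if the collection fails.
        if calibration.collect_data(*point) != tr.CALIBRATION_STATUS_SUCCESS:
            calibration.collect_data(*point)

    result = calibration.compute_and_apply()
    print("Calibration status: {}, points: {}".format(
        result.status, len(result.calibration_points)))

    # Steps 8-9: redo points with missing or poor data, then recompute.
    points_to_redo = [(0.1, 0.1)]   # hypothetical point judged to be too poor
    for point in points_to_redo:
        calibration.discard_data(*point)
        calibration.collect_data(*point)
    result = calibration.compute_and_apply()

    calibration.leave_calibration_mode()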

It is possible to save a calibration for a person locally and reapply it at a later time. This is useful if the same person will be using the eye tracker again, as you then don't have to go through the entire calibration procedure each time. To get the currently applied calibration from the eye tracker, call the Retrieve Calibration function. To apply a saved calibration, call the Apply Calibration function.
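
A sketch of saving and re-applying a calibration with the Python binding is shown below; the file name is just an example.

    import tobii_research as tr

    eyetracker = tr.find_all_eyetrackers()[0]

    # Retrieve the calibration currently applied on the eye tracker and save it.
    calibration_data = eyetracker.retrieve_calibration_data()
    if calibration_data is not None:
        with open("participant01_calibration.bin", "wb") as f:
            f.write(calibration_data)

    # In a later session, load the saved calibration and re-apply it.
    with open("participant01_calibration.bin", "rb") as f:
        eyetracker.apply_calibration_data(f.read())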

NOTE: Gaze data is available from the eye tracker even before a calibration has been completed successfully. However, the mapping of the gaze data is then based either on a default eye model or on a previous calibration, depending on the eye tracker model. Hence, this data should only be used as an indication of where a person is looking, or for the eye position in the user position guide stream.

The table below gives the name in the different languages for the concepts introduced and highlighted in italics in the text above.

Words used in the text | .NET/Unity | Python | Matlab | C
Enter Calibration Mode | EnterCalibrationMode() | enter_calibration_mode() | enter_calibration_mode() | tobii_research_screen_based_calibration_enter_calibration_mode()
Leave Calibration Mode | LeaveCalibrationMode() | leave_calibration_mode() | leave_calibration_mode() | tobii_research_screen_based_calibration_leave_calibration_mode()
Calibration Mode Entered | CalibrationModeEntered | EYETRACKER_NOTIFICATION_CALIBRATION_MODE_ENTERED | TOBII:NOTIFICATION:calibration_mode_entered | TOBII_RESEARCH_NOTIFICATION_CALIBRATION_MODE_ENTERED
Calibration Mode Left | CalibrationModeLeft | EYETRACKER_NOTIFICATION_CALIBRATION_MODE_LEFT | TOBII:NOTIFICATION:calibration_mode_left | TOBII_RESEARCH_NOTIFICATION_CALIBRATION_MODE_LEFT
Collect data | CollectData() | collect_data() | collect_data() | tobii_research_screen_based_calibration_collect_data()
Discard data | DiscardData() | discard_data() | discard_data() | tobii_research_screen_based_calibration_discard_data()
Compute and apply | ComputeAndApply() | compute_and_apply() | compute_and_apply() | tobii_research_screen_based_calibration_compute_and_apply()
Applying a calibration | ApplyCalibrationData() | apply_calibration_data() | apply_calibration_data() | tobii_research_apply_calibration_data()
Retrieving a calibration | RetrieveCalibrationData() | retrieve_calibration_data() | retrieve_calibration_data() | tobii_research_retrieve_calibration_data()

Calibration plots

If you have previous experience with any of Tobii’s eye tracking products, it is very likely that you have seen a calibration plot illustrating the calibration results. The calibration plot is a simple yet concise representation of a performed calibration and it usually looks something like what is shown below, although the presentation design can vary.

Typical calibration plot

The calibration plot shows the offset between the mapped gaze samples and the calibration points, based on the best possible adaptation of the eye model to the values collected by the eye tracker during calibration. In the image above, the red and green lines represent the offset between the mapped sample points (red for the left eye, green for the right eye) and the center of where the calibration points were shown. The circles are the actual calibration points. The data displayed in the plot is made available to client applications through the calibration result. If the calibration was successful, the result contains a collection of calibration points, each of which contains the calibration samples showing where the collected data was mapped, as well as the position on the display area where the calibration point was shown. This makes it possible to implement alternative visualizations of the calibration result in addition to the traditional one seen above.
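
As a sketch of how the calibration result could be consumed in the Python binding (attribute names as in the Python API reference), the function below prints the per-sample offsets that a calibration plot would draw:

    import tobii_research as tr

    def print_calibration_offsets(result):
        # 'result' is the CalibrationResult returned by compute_and_apply().
        if result.status != tr.CALIBRATION_STATUS_SUCCESS:
            print("Calibration was not successful")
            return
        for point in result.calibration_points:
            target = point.position_on_display_area      # where the point was shown
            for sample in point.calibration_samples:
                left = sample.left_eye.position_on_display_area
                right = sample.right_eye.position_on_display_area
                # Offsets between mapped samples and the calibration point,
                # in normalized Active Display Area coordinates.
                print("target", target,
                      "left offset", (left[0] - target[0], left[1] - target[1]),
                      "right offset", (right[0] - target[0], right[1] - target[1]))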

Words used in the text | .NET | Python | Matlab
Calibration result | CalibrationResult | CalibrationResult | CalibrationResult
Calibration point | CalibrationPoint | CalibrationPoint | CalibrationPoint
Calibration sample | CalibrationSample | CalibrationSample | CalibrationSample
Position on the display area | NormalizedPoint2D | tuple (uses a standard class) | array

Monocular calibration

Some eye trackers, such as the Tobii Pro Spectrum, support calibrations using just one eye. This is useful if one eye causes the regular calibration process to fail. This could happen, for example, due to severe strabismus, or if one eye is a prosthetic. Additionally, research into binocular coordination and fixation disparity requires separate calibrations for the two eyes, often using an eye patch to temporarily cover the eye that is not being calibrated.
Calibrating one eye, or each eye separately, is done in much the same way as the regular calibration, with a few minor differences (illustrated in the sketch after the list below):

  • The monocular calibration can only be done for screen-based eye trackers.
  • Instead of the regular calibration object, another monocular calibration object is used.
  • The function for collecting data requires an additional argument which specifies which eye(s) the data is collected for.
  • Calibration points can be repeated for the other eye without replacing the previously collected data from the first eye.
  • The return status from Compute and Apply will indicate success for either the left, the right, or both eyes, or will return a failure.
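
A minimal sketch of a left-eye-only calibration in the Python binding could look as follows; a second pass with SELECTED_EYE_RIGHT (or a single pass with SELECTED_EYE_BOTH) works the same way.

    import tobii_research as tr

    eyetracker = tr.find_all_eyetrackers()[0]
    calibration = tr.ScreenBasedMonocularCalibration(eyetracker)
    points_to_calibrate = [(0.5, 0.5), (0.1, 0.1), (0.9, 0.1), (0.1, 0.9), (0.9, 0.9)]

    calibration.enter_calibration_mode()
    for x, y in points_to_calibrate:
        # Collect data for the left eye only; the right eye could be covered
        # with an eye patch and calibrated in a second pass.
        calibration.collect_data(x, y, tr.SELECTED_EYE_LEFT)
    result = calibration.compute_and_apply()
    if result.status in (tr.CALIBRATION_STATUS_SUCCESS,
                         tr.CALIBRATION_STATUS_SUCCESS_LEFT_EYE):
        print("Left eye calibrated")
    calibration.leave_calibration_mode()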

Words used in the text | .NET | Python | Matlab | C
Calibration object | ScreenBasedMonocularCalibration | ScreenBasedMonocularCalibration | ScreenBasedMonocularCalibration | tobii_research_calibration.h
Collecting data | CollectData() | collect_data() | collect_data() | tobii_research_screen_based_monocular_calibration_collect_data()
Additional arguments | SelectedEye.LeftEye, SelectedEye.RightEye, SelectedEye.BothEyes | tobii_research.SELECTED_EYE_LEFT, tobii_research.SELECTED_EYE_RIGHT, tobii_research.SELECTED_EYE_BOTH | SelectedEye.LEFT, SelectedEye.RIGHT, SelectedEye.BOTH | TOBII_RESEARCH_SELECTED_EYE_LEFT, TOBII_RESEARCH_SELECTED_EYE_RIGHT, TOBII_RESEARCH_SELECTED_EYE_BOTH
Return status | CalibrationStatus.Success, CalibrationStatus.SuccessLeftEye, CalibrationStatus.SuccessRightEye, CalibrationStatus.Failure | tobii_research.CALIBRATION_STATUS_SUCCESS, tobii_research.CALIBRATION_STATUS_SUCCESS_LEFT_EYE, tobii_research.CALIBRATION_STATUS_SUCCESS_RIGHT_EYE, tobii_research.CALIBRATION_STATUS_FAILURE | CalibrationStatus.Success, CalibrationStatus.SuccessLeftEye, CalibrationStatus.SuccessRightEye, CalibrationStatus.Failure | TOBII_RESEARCH_CALIBRATION_SUCCESS, TOBII_RESEARCH_CALIBRATION_SUCCESS_LEFT_EYE, TOBII_RESEARCH_CALIBRATION_SUCCESS_RIGHT_EYE, TOBII_RESEARCH_CALIBRATION_FAILURE

Calibration validation

When performing an eye tracking study, it can be very useful to validate the estimated performance after the calibration. This step is often referred to as Calibration validation. The common procedure is to show a new set of stimulus points to the participant, collect gaze data during the stimulus presentation, and calculate accuracy and precision values based on the position of the gaze data in relation to the stimulus point the participant was expected to look at.

Tobii has published a set of open source add-ons to the Tobii Pro SDK to help with performing a calibration validation. The add-ons include all necessary functions for collecting gaze data and calculating the performance results.

Calibration validation add-ons on GitHub
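
As a rough illustration of the idea only (the add-ons linked above compute these metrics properly, in degrees of visual angle), accuracy can be thought of as the mean offset from the stimulus point and precision as the sample-to-sample RMS:

    import math

    def accuracy_and_precision(stimulus_point, gaze_points):
        # stimulus_point: (x, y) the participant was asked to look at,
        # gaze_points: (x, y) gaze points collected while it was shown,
        # both in normalized Active Display Area coordinates.
        offsets = [math.hypot(gx - stimulus_point[0], gy - stimulus_point[1])
                   for gx, gy in gaze_points]
        accuracy = sum(offsets) / len(offsets)    # mean offset from the target
        steps = [math.hypot(x2 - x1, y2 - y1)
                 for (x1, y1), (x2, y2) in zip(gaze_points, gaze_points[1:])]
        precision_rms = math.sqrt(sum(s * s for s in steps) / len(steps))
        return accuracy, precision_rms

    print(accuracy_and_precision((0.5, 0.5), [(0.51, 0.49), (0.52, 0.50), (0.50, 0.52)]))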

Calibration and Gaze

It is possible to subscribe to gaze data during calibration. This can be used to validate that the user is focusing on the stimulus before starting the data collection for that stimulus point. The application must take into account that the gaze data in this situation is uncalibrated and may be far off target.

To use this capability, your application should

  • Start gaze subscription before calibration is started, and
  • Maintain the gaze subscription until the calibration is completed.

Intermittently calling subscribe/unsubscribe during the calibration workflow is not supported and may lead to undefined behavior.

During calibration, certain characteristics of the gaze signal (e.g. its frequency) may change in order to accommodate both the signal subscription and the data collection, depending on the tracker model.
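
A sketch of this pattern in the Python binding is shown below; the gaze data received here is uncalibrated and is only used as a rough on-target check.

    import tobii_research as tr

    eyetracker = tr.find_all_eyetrackers()[0]
    latest_gaze = {}

    def gaze_callback(gaze_data):
        # Keep only the most recent sample; during calibration this data is
        # uncalibrated and should only be treated as a rough indication.
        latest_gaze.update(gaze_data)

    # Start the gaze subscription before calibration starts...
    eyetracker.subscribe_to(tr.EYETRACKER_GAZE_DATA, gaze_callback, as_dictionary=True)

    calibration = tr.ScreenBasedCalibration(eyetracker)
    calibration.enter_calibration_mode()
    # ... use latest_gaze to check that the user is roughly on target before
    #     calling collect_data for each point, then compute_and_apply() ...
    calibration.leave_calibration_mode()

    # ...and keep the subscription until the calibration is completed.
    eyetracker.unsubscribe_from(tr.EYETRACKER_GAZE_DATA, gaze_callback)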

Calibrating with Tobii Pro Eye Tracker Manager

As an alternative to implementing your own calibration stimuli presentation software, you can use Tobii Pro Eye Tracker Manager's calibration feature to perform calibrations. Tobii Pro Eye Tracker Manager is a stand-alone application and you need to manually start it, select eye tracker, and click on start calibration to initiate the calibration procedure. There is also the option of using Tobii Pro Eye Tracker Managers calibration feature directly from the commandline. Passing which tracker to calibrate and which screen to use as parameters. This way you can integrate Tobii Pro Eye Tracker Manager into your workflow without having to start the application manually.