
Checking if Fixation is Inside a Region of Interest

In this simple guide, we demonstrate how to check in real time whether a participant is looking inside a user-specified region of the screen (a region of interest, or ROI).

This demonstration uses a new module, TPxTools.py, which is not yet part of our software release. It can be downloaded as a standalone module here: [Download]

To use this module in your Python editor, you must download it, unzip it, and add it to your path. Most IDEs contain a path manager in the Tools or Preferences menu.

You should also verify that our full pypixxlib API is installed in PsychoPy.

Regions of Interest

A region of interest, or ROI, is an area on a display that the researcher is interested in monitoring. It could be the location of a target or distractor, a fixation cross, or another visual component.

When the participant looks inside this region, the researcher may wish to record information or update the display's contents, for example to begin the next trial.

When a display updates based on the participant’s looking behaviours, this is called a gaze-contingent display or gaze-contingent paradigm. Critically, gaze-contingent displays require online monitoring of participants’ eye position, usually inside a programmatic check loop.
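
At its core, an ROI check is a simple containment test. The sketch below shows a minimal version for a rectangular region; the helper name inRect and the (left, bottom, right, top) tuple layout are our own conventions for illustration, matching the bounding boxes used later in this guide:

PY
def inRect(x, y, roi):
    # 'roi' is a rectangle given as (left, bottom, right, top)
    # in Cartesian screen pixel coordinates; (x, y) is a gaze sample.
    left, bottom, right, top = roi
    return left <= x <= right and bottom <= y <= top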

Accessing online eye position

During TRACKPixx3 eye tracker recording, gaze data is acquired and stored on the DATAPixx3 device. While recording is started and stopped via PC commands, the recording itself occurs independently of, and in parallel with, your PC. This is how we achieve an uninterrupted sampling rate of 2 kHz.

It is possible to bulk import recorded gaze data from the DATAPixx3 onboard memory to your PC and save it for offline analysis. This is what we recommend for data analysis where temporal precision is critical.

However, in some cases, such as with gaze-contingent displays, we need to access eye position while the experiment is still running. This requires polling the ongoing recording for the latest recorded data. The function TPxGetEyePosition does this:

PY
from pypixxlib import _libdpx as dp

...
dp.DPxUpdateRegCache() #Get latest status of DP3 hardware
eyePosition = dp.TPxGetEyePosition()

TPxGetEyePosition returns a list with the following information:

PY
list: [xScreenLeft, yScreenLeft, xScreenRight, yScreenRight,
       xLeftRaw, yLeftRaw, xRightRaw, yRightRaw,
       timeStamp]

where "Screen" elements are calibrated eye positions in Cartesian screen pixel coordinates,
"Raw" elements are uncalibrated corneal-to-pupil-center vectors in radians,
and "timeStamp" is the DATAPixx time of the samples in seconds.

You can call this command directly and then compute the relevant checks to see whether the participant is making the desired eye movements and fixations. You may also wish to access the fixation and saccade flags to see whether each eye is currently fixating or in motion:

PY
from pypixxlib import _libdpx as dp

...
dp.DPxUpdateRegCache() # Get latest status of DP3 hardware
fixatingLeft, fixatingRight = dp.TPxIsSubjectFixating() # Returns True if fixation criteria met
saccadeLeft, saccadeRight = dp.TPxIsSubjectMakingSaccade() # Returns True if saccade criteria met

So for instance, in our checking loop in a gaze-contingent paradigm, we may assess the following (see the sketch after this list):

  • Is the participant fixating?

  • If yes, are they fixating inside the region of interest?

  • If yes to both, update the display.
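
Put together, a manual version of this check might look like the following sketch. It uses only the commands shown above, plus the illustrative inRect helper defined earlier; monitoring the left eye is an arbitrary choice, and the helper name fixationInROI is ours:

PY
from pypixxlib import _libdpx as dp

def fixationInROI(roi):
    # Returns True if the left eye is currently fixating inside 'roi',
    # where 'roi' is (left, bottom, right, top) in screen pixel coordinates.
    dp.DPxUpdateRegCache()                       # Get latest status of DP3 hardware
    fixatingLeft, _ = dp.TPxIsSubjectFixating()  # Step 1: is the participant fixating?
    if not fixatingLeft:
        return False
    eyePosition = dp.TPxGetEyePosition()
    xLeft, yLeft = eyePosition[0], eyePosition[1]
    return inRect(xLeft, yLeft, roi)             # Step 2: is the fixation inside the ROI?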

The first two steps of this process are automated by the TPxTools function isInROI, which accepts a rectangular area and the eye to monitor, and returns True if the participant is currently fixating in that region. In the following section, we provide a simple demo showcasing this function in a gaze-contingent experiment.

Demo: Look at the target to continue

In the following demo, a series of targets appear in random positions on the screen. The experiment proceeds when the participant successfully fixates on the target location.

The code uses TPxTools to calibrate the eye tracker, then uses isInROI to check whether the participant looks inside the target area before continuing to the next trial. We use the marker system to get the exact moment the screen was updated with the target, and the moment when the isInROI command returns True. This gives us an approximate measure of the participant's reaction time.

PY
from psychopy import visual, core
import random
import TPxTools as tp
from pypixxlib import _libdpx as dp

def drawRandomDot(win):
    # Get the window size
    screen_width, screen_height = win.size
    
    # Define the inner 80% region
    inner_width = screen_width * 0.8
    inner_height = screen_height * 0.8
    
    # Define dot properties
    dot_diameter = 40
    dot_radius = dot_diameter / 2
    
    # Compute random x, y within the inner 80%
    x_min = -inner_width / 2 + dot_radius
    x_max = inner_width / 2 - dot_radius
    y_min = -inner_height / 2 + dot_radius
    y_max = inner_height / 2 - dot_radius
    
    x_pos = random.uniform(x_min, x_max)
    y_pos = random.uniform(y_min, y_max)
    
    # Draw the dot
    dot = visual.Circle(win, radius=dot_radius, fillColor='white', lineColor='white', pos=(x_pos, y_pos))
    dot.draw()
    dp.DPxSetMarker()                    # Latch DATAPixx time into the marker register on the next write
    dp.DPxWriteRegCacheAfterVideoSync()  # Schedule that write for the next video sync, when the dot appears
    win.flip()
    
    # Return the bounding box coordinates
    rect_coords = (x_pos - dot_radius, y_pos - dot_radius, x_pos + dot_radius, y_pos + dot_radius)
    return rect_coords

     
#MAIN SCRIPT 
###############################################################################
# Step 1: Open a window on the screen specified by 'screen'
windowPtr = visual.Window(color='black', screen=1, fullscr=True, units='pix')

# Step 2: Call a calibration
calib = tp.initialCalibration(windowPtr)

# Step 3: Present experiment instructions
message = visual.TextStim(windowPtr, text="Look at the dots!", pos=(0, 0), color=(1, 1, 1))
message.draw()
windowPtr.flip()
  
core.wait(4)

##Step 4: Trial loop----------------------------------------------------------------------
numTrials = 5
for trial in range(numTrials):
    rect = drawRandomDot(windowPtr)
    
    # Wait until subject fixates inside target region
    while not tp.isInROI(rect):
        core.wait(0.01)
        
    # Access time when target was drawn
    startTime = dp.DPxGetMarker()
    
    # Access time of last register update, i.e., during the ROI check
    endTime = dp.DPxGetTime()
    
    #Calculate delay
    timeToLook = endTime - startTime
    timeToLook_formatted = f"{timeToLook:.2g}"
    
    print(f"Trial {trial + 1}: Target fixated within {timeToLook_formatted} seconds!")
    
    windowPtr.flip()
    core.wait(0.5)

##Continue experiment here-----------------------------------------------------

## Step 5: Close the window
windowPtr.close()
tp.shutdownTracker()
core.quit()

Video demonstration:

Note this video skips the calibration. For more on the calibration tools offered in TPxTools, see TRACKPixx3 calibration and data collection routine in Python IDE.

In this video, eye position is simulated using the LabMaestro Simulator and the user’s mouse. Hovering the mouse over the target is equivalent to a participant fixating the target. For more information about how to simulate eye tracking with the mouse, see Eye Tracking.

The temporal precision of real-time polling of eye position

If we can access gaze data in real time on the PC, why bother with parallel recording on the DATAPixx3 at all? The reason is temporal precision. Polling the DATAPixx3 hardware takes time and introduces a slight delay (~0.4 ms) between when the eye has moved and when the PC reports eye position. In other words, eye position accessed by the PC is slightly out of date. Depending on what else your code is doing, the delay can be even longer and highly variable, making eye position data accessed via this method temporally imprecise.
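
If you want to gauge this overhead on your own system, you can time a single poll from Python. This is a rough sketch; it measures the round trip of the register update as seen from the PC, not the hardware latency itself:

PY
import time
from pypixxlib import _libdpx as dp

t0 = time.perf_counter()
dp.DPxUpdateRegCache()                 # Poll the DATAPixx3 for its latest status
eyePosition = dp.TPxGetEyePosition()   # Read the most recent gaze sample
pollDuration = time.perf_counter() - t0
print(f"Single poll took {pollDuration * 1000:.2f} ms")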

This delay and variability are absent from the continuously recorded data saved on our hardware. Therefore, we strongly suggest that offline analyses use the data saved directly on, and imported from, the DATAPixx3 (i.e., data managed via the start, stop, and save TPx data commands).
