Using VPixx Hardware with PsychoPy

In this VOCAL, we will cover how to use VPixx Technologies' software tools for Python, with an emphasis on using them with the PsychoPy suite of tools for scientists.

If you are new to PsychoPy, we encourage you to check out their Resources for Learning and Teaching webpage to learn more about using their system. In general, PsychoPy offers two methods of working with its tools: 

  • Use PsychoPy’s Builder interface to graphically design an experiment, 
    supplementing their built-in tools with .csv conditions files and custom code blocks. 

  • Import the PsychoPy library into your Python Integrated Development Environment (IDE) of choice, such as PsychoPy Coder, Spyder, or PyCharm.

Left: The PsychoPy Builder interface with the custom code block editor open. Right: Spyder 5, a popular Python IDE.

VPixx Technologies has a library of Python functions for our data acquisition systems called pypixxlib. This library lets users interact directly with our hardware in their Python experiment code. Pypixxlib is the Python equivalent of our Datapixx commands for MATLAB/Psychtoolbox, which you can read more about here: Introduction to VPixx MATLAB documentation.

For a full list of our Python functions and some IDE-based examples, please check out our pypixxlib documentation. There, you will also find instructions on installing our Python tools in the Getting Started section. Common Python troubleshooting problems and their solutions are documented in our Frequently Asked Questions – Troubleshooting.

This VOCAL guide aims to orient researchers to using pypixxlib with PsychoPy. We will start with a general introduction to the two ‘flavours’ of pypixxlib and then discuss how to implement our library within the Builder and in a Python IDE. We will then cover some general tips and recommendations for working with PsychoPy.

If you are using a VIEWPixx /EEG and/or your primary goal is to generate frame-accurate digital TTL triggers, we have a fully automated solution called Pixel Mode. If you use Pixel Mode, you do not need to invoke pypixxlib in your experiment. However, we still recommend you read the Tips section of this guide for helpful information about drawing pixels in PsychoPy so triggers are correctly sent.

For most other applications, you will almost certainly have to write some code. This is true even if you are using the Builder; there are no default drag-and-drop tools to support VPixx hardware, so you will need to use custom code blocks to interact with our systems. This guide provides some examples to help you get started.

The two flavours of pypixxlib

There are two general ways to interact with pypixxlib, reflecting two different programming styles. Which approach you decide to take depends on your personal preference, previous programming experience, and general comfort level with coding. 

Object-oriented programming (OOP)

OOP groups code elements into classes with unique attributes (properties) and functions (behaviours). Broadly defined parent classes may also have child classes that contain more specific attributes and functions. For example, PsychoPy's visual module defines a base visual stimulus class whose attributes and functions are shared by all visual elements, along with child classes like Line that add attributes and functions for drawing lines.

In OOP, specific instances of classes, called objects, are created and strategically called in the code. Pypixxlib has classes for all of our devices. We also have specific I/O subclasses for each of our different kinds of data acquisition and control (including audio inputs, audio outputs, digital inputs, etc.) that are accessed through a device object.

Here’s a simple example: We create an instance of a DATAPixx3 object and then call the setVolume() function of its audio subclass.

PY
from pypixxlib.datapixx import DATAPixx3
myDevice = DATAPixx3()        #create a DATAPixx3 device object
myDevice.open()               #open the USB connection to the device
myDevice.audio.setVolume(0.5) #queue a volume change via the audio subclass
myDevice.writeRegisterCache() #execute all queued commands on the hardware

OOP code tends to be elegant and efficient, but it takes time to conceptualize. If you’re interested in learning more about this programming style, many resources are available on the web.

Wondering about the final line of code in the example, writeRegisterCache()? We will cover this in more detail in the next section of the guide.

Procedural programming

Procedural programming is a style in which individual functions are called step-by-step to build a program. Procedural code is a linear set of instructions, like a recipe. The downside to this approach is that previously written code is harder to reuse, so procedural code is sometimes less efficient.

Our tools for MATLAB are procedural, following the example of Psychtoolbox. Initially, pypixxlib maintained the procedural format, and the OOP tools were added later. The procedural functions can be found in a subsection of pypixxlib called libdpx. Procedural libdpx commands have a prefix, usually DPx.

Here’s the same example as before of setting audio volume, using libdpx procedural code this time: 

PY
from pypixxlib import _libdpx as dp
dp.DPxOpen()            #open the USB connection to the device
dp.DPxSetAudVolume(0.5) #queue a volume change
dp.DPxWriteRegCache()   #execute all queued commands on the hardware

Libdpx commands can be used across all VPixx devices supported by our software tools. Some device-specific functions are prefaced with variations of the DPx prefix; for example, libdpx commands for our eye-tracking systems are prefaced with TPx.

If you have more than one VPixx device connected by USB, such as a PROPixx projector and a PROPixx controller, libdpx commands will target the appropriate device based on the nature of the command. In the example above, DPxSetAudVolume would be sent to the PROPixx controller, as this device manages audio. If you want to force libdpx to target a specific device, use the command DPxSelectDevice and pass the device type or name as an argument, followed by your libdpx command.
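
Here is a minimal sketch of forcing libdpx to target a specific device before issuing commands. The device name string 'PROPixx' is an assumption for illustration; check the DPxSelectDevice entry in our pypixxlib documentation for the exact names your setup accepts.

PY
#Force subsequent libdpx commands to target the PROPixx projector itself,
#rather than letting libdpx route the command automatically
dp.DPxSelectDevice('PROPixx') #device name string is an assumption; see docs
#...libdpx commands for the selected device go here...
dp.DPxWriteRegCache()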

Procedural programming is much more intuitive for beginner programmers. The libdpx commands also closely follow the format of our MATLAB/Psychtoolbox functions, so if you are familiar with our MATLAB tools already, libdpx will feel very similar.

Ultimately, the decision of which strategy to use, OOP or procedural, is up to the user’s preferences. Both provide a solid foundation for working with our Python tools.

A quick review of VPixx’s register writing and updating approach

API control over VPixx devices uses a method called register writing and register updating. When you implement a line of code to control our hardware, this command will not execute immediately. Instead, it waits for a special command that indicates it is time to execute. This command is called a register write or a register update.  

PY
dp.DPxSetAudVolume(0.5) #queued; nothing happens on the hardware yet
dp.DPxWriteRegCache()   #register write: execute all queued commands now

Register writes allow you to execute several queued device commands simultaneously. This execution can be tied directly to the behaviour of your display; VPixx has a series of special register write functions that delay the time of write execution until a specific event in the video signal is detected. This allows you to synchronize your hardware commands with your video frames or visual stimulus onset.
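
As a minimal sketch (assuming dp is the imported _libdpx module and the device connection is open), a video-synchronized register write looks like this:

PY
#Queue a digital output change; nothing happens on the hardware yet
dp.DPxSetDoutValue(1, 0xFFFF) #set dout bit 0 high; mask covers all bits
#Delay execution of all queued commands until the next vertical sync
#is detected in the video signal
dp.DPxWriteRegCacheAfterVideoSync()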

Register updates follow the same principle as register writes, but they also return a copy of the device status. This is useful for getting up-to-date information from our hardware, such as the system clock time and the current state of the I/O ports.
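
For example, a register update is the natural way to fetch the current hardware clock time. A minimal sketch, with the same assumptions as above:

PY
dp.DPxUpdateRegCache()       #execute queued commands AND read back device status
deviceTime = dp.DPxGetTime() #hardware clock time from the refreshed local copy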

You can read more about our register system in our VOCAL The Logic of VPixx Hardware Control. We highly recommend this guide for researchers interested in precise timing control in their experiments.

For now, the key points to remember are:

  • With a handful of exceptions, most pypixxlib commands must be followed by a register write or an update for your hardware to execute the requested changes.

  • A register write executes all queued hardware commands. It should be used in situations where a fast turnaround time is needed, and no data is returned from the device.

  • A register update (aka a write-read) performs a write and returns a copy of the device status to your PC. It should be used when you need a fresh copy of the status of your hardware.

It is a good idea to remember this register system when troubleshooting your experiment code. Is your hardware not behaving as expected? You might want to check to make sure you included a register write. Does information retrieved from the device seem outdated or missing? You might want to check when your last register update was and whether you have up-to-date device information.

Invoking pypixxlib in PsychoPy Builder

Our tools are not natively supported in PsychoPy’s Builder interface. This means that when you need to interact with our hardware, you will have to use custom code blocks strategically placed within the experiment flow. These blocks can use either OOP or procedural-style code.

While custom code might sound daunting to new experiment designers, remember that these code blocks don’t need to be overly complicated. We offer many examples below that you can copy or tweak as needed. These code blocks can be reused across multiple experiments accessing the same equipment.

Create custom code blocks in PsychoPy Builder from the Components > Custom > Code Icon

Custom code blocks can be inserted as elements into specific routines in your Experiment Flow. Code blocks also have an additional feature allowing you to specify where in the experiment you would like them to occur. 

A blank code block editor

Below are some common examples of code which might appear in each one of these positions:

  • Before Experiment: Connect to VPixx hardware, apply settings changes 

  • Begin Experiment: Apply settings changes, upload waveforms into the VPixx device RAM for playback during experiment (e.g., audio files, digital TTL waveforms, custom analog waveforms), turn on any recording schedules that will be on for the entire experiment (e.g., eye tracking).

  • Begin Routine: Set up and turn on input recording (e.g., listening for button activity), set up output for immediate playback, or playback with a video-based event (e.g., play audio at the same time as a visual stimulus, send a custom TTL waveform on stimulus onset)

  • Each Frame: Routinely check hardware for important status changes (like a button press being recorded, or the participant fixating a target). Note: frame intervals are typically under 17 ms, so you only have a brief window of time to execute this code; keep your code to a bare minimum here (see the brief sketch after this list).

  • End Routine: Import recorded data and evaluate it, save any important timestamps, disable recording or playback if needed

  • End Experiment: Shut down any ongoing recordings, restore hardware defaults, close connection to VPixx hardware
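
As an illustration, here is a minimal sketch of an "Each Frame" status check. It assumes dp is the imported _libdpx module and the device connection is open; anything much heavier than a quick register update risks overrunning the frame interval.

PY
#Refresh the local copy of the device status, then read the current state
#of the digital input port from it. Keep "Each Frame" code this short.
dp.DPxUpdateRegCache()
dinState = dp.DPxGetDinValue()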

Some examples are given below.

Whenever you insert a code block, remember to ensure the Code Type option selected is “Py” or Python.

Examples

Looking for a specific example? Most of our demos are based on customer suggestions. If you are looking for support with a specific feature, and can’t find an example in our VOCAL or demo libraries, contact our team and suggest a new demo. Your request can help the greater VPixx user community!

Establishing a connection to VPixx hardware

This block must be run at the beginning of every experiment.

PY
#Code block best positioned "Before Experiment" so any problems are caught right away
#import our library and open the device connection
#throw an error if device is not connected properly
from pypixxlib import _libdpx as dp

dp.DPxOpen()
isReady = dp.DPxIsReady()
if not isReady:
    raise ConnectionError('VPixx Hardware not detected! Check your connection and try again.')

Enabling Pixel Mode and creating a Pixel Trigger

Pixel Mode is a method of sending automated TTL triggers locked to the onset of a visual stimulus. Read more about this mode here: Sending Triggers with Pixel Mode.

This block enables Pixel Mode on our data acquisition systems (note: it is on by default for the VIEWPixx /EEG). It also creates a helper function called drawPixelModeTrigger(), which you can call just before the frame where you would like the trigger to fire. The last few lines can be pasted into a code block just before the target video frame; these will generate your custom trigger.

PY
#Code block best positioned "Before Experiment" or "Begin Experiment"
#Assumes libdpx has been imported as dp, the device is connected, and
#PsychoPy's visual module is available (Builder imports it for you)

#Configure hardware to be in Pixel Mode
dp.DPxEnablePixelMode()
dp.DPxWriteRegCache()

#Helper function to draw pixel trigger
def drawPixelModeTrigger(win, pixelValue):
    #takes a pixel colour and draws it as a single pixel in the top left corner of the window
    #window must cover top left of screen to work
    #interpolate must be set to FALSE before color is set
    #call this just before flip to ensure pixel is drawn over other stimuli
    topLeftCorner = [-win.size[0]/2, win.size[1]/2]
    line = visual.Line(
            win=win,
            units = 'pix',
            start=topLeftCorner,
            end=[topLeftCorner[0]+1, topLeftCorner[1]],
            interpolate = False,
            colorSpace = 'rgb255',
            lineColor = pixelValue)
    line.draw()
    
#Uncomment and paste the following in a code block just before your target video frame to draw your pixel trigger.
#There should be no delay between this code and your target video frame, so if you are drawing other stimuli make 
#sure to draw them right away. You may want to start a routine where stimulus onset occurs at t0 and add this code 
#block in the "Begin Routine" phase
#myTriggerValue = 33
#myPixelValue = dp.DPxTrigger2RGB(myTriggerValue)
#drawPixelModeTrigger(win, myPixelValue)

Loading an audio waveform into device memory

Save your audio file to VPixx hardware memory for playback during the experiment. 

PY
#Code block best positioned "Begin Experiment"
#Assumes libdpx has been imported and device is connected
#Requires scipy for importing audio files

from scipy.io import wavfile

#file path to sound file
soundFile = 'C:/.../myfile.wav'

#import sound file into Python
fs, audio_data = wavfile.read(soundFile)
maxScheduleFrames = len(audio_data) 

#some settings
volume = 0.5 #50 percent
bufferAddress = int(16e6) 
onsetDelay = 0.0 
stereoMode = 'mono'

dp.DPxInitAudCodec()
dp.DPxWriteAudioBuffer(audio_data, bufferAddress)
dp.DPxSetAudioSchedule(onsetDelay, fs, maxScheduleFrames, stereoMode, bufferAddress)    
dp.DPxWriteRegCache()
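
To start playback of the uploaded waveform later in the experiment (for example, in a "Begin Routine" code block), a minimal sketch looks like this:

PY
#Start the audio schedule configured above; playback begins once the
#register write executes
dp.DPxStartAudSched()
dp.DPxWriteRegCache()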

Setting up Button Schedules

This mode allows you to define unique TTL waveforms for each of your RESPONSEPixx buttons and save them on the hardware. Button presses will then immediately trigger playback of these waveforms on the acquisition system's digital output. Use this method to pass button activity to other hardware (MEG, EEG). To learn more about button forwarding, see Forwarding RESPONSEPixx Button Activity to a Third-Party Receiver.

PY
#Code block best positioned "Begin Experiment" when parameters are being set
#The schedule will work automatically, including catching any accidental presses
#Assumes library has been imported and device is connected

#Enable debounce. When a DIN transitions, ignore further DIN transitions for next 30 milliseconds 
#(good for response buttons and other mechanical inputs)
dp.DPxEnableDinDebounce()

#Set our mode. The mode can be:
#  0 -- The schedule starts on a rising edge (press of RPx /MRI, release of RPx)
#  1 -- The schedule starts on a falling edge (release of RPx /MRI, press of RPx)
#  2 -- The schedule starts on both rising and falling edges (presses and releases, both RPx types)
# For modes 0 and 1, you put the schedule at baseAddr + 4096*DinValue
# For mode 2, you put the schedule for a falling edge at baseAddr + 4096*DinValue
# and for a rising edge at baseAddr + 4096*DinValue + 2048
#Not sure which DinValues correspond to which buttons? Have special wiring?
#Check directly by using our PyPixx > Digital I/O demo and pressing buttons

dp.DPxSetDoutButtonSchedulesMode(0)
signalLength = 6
baseAddress = int(9e6)

#Red button (DinValue 0)
redSignal = [1, 0, 0, 0, 0, 0] #single pulse on dout 0
redAddress =  baseAddress + 4096*0
dp.DPxWriteDoutBuffer(redSignal, redAddress)

#Yellow button (DinValue 1)
yellowSignal = [1, 0, 1, 0, 0, 0] #two pulses on dout 0
yellowAddress =  baseAddress + 4096*1
dp.DPxWriteDoutBuffer(yellowSignal, yellowAddress)

#Green button (DinValue 2)
greenSignal = [2, 0, 0, 0, 0, 0] #single pulse on dout 1
greenAddress =  baseAddress + 4096*2
dp.DPxWriteDoutBuffer(greenSignal, greenAddress)

#Blue button (DinValue 3)
blueSignal = [2, 0, 2, 0, 0, 0] #two pulses on dout 1
blueAddress =  baseAddress + 4096*3
dp.DPxWriteDoutBuffer(blueSignal, blueAddress)

scheduleOnset = 0.0 #no delay
scheduleRate = 2 #waveform playback rate 2 samples/sec

dp.DPxSetDoutSchedule(scheduleOnset, scheduleRate, signalLength+1, baseAddress)
dp.DPxEnableDoutButtonSchedules()
dp.DPxWriteRegCache()

Setting up and starting a RESPONSEPixx listener

RESPONSEPixx button box activity is recorded on the digital input of the data acquisition system. We use a special schedule called a digital input log to record only changes in the state of this port, signalling either a button press or a release.

A RESPONSEPixx ButtonListener is a special class in our OOP library that encapsulates all of the commands required to set up and collect button activity from the digital input log. While it is not necessary to use a listener to monitor button activity, this tool streamlines button box monitoring. The ButtonListener begins recording data as soon as it is created, and it can listen for button presses, button releases, or both.

PY
#Code block best positioned "Begin Routine" at either the start of a block or a trial
#depending on how continuously you want to record data
#Assumes library has been imported and device is connected
from pypixxlib import responsepixx as rp

#Create button listener, and pass button box type as argument. 
# Accepted values (not case sensitive):
#     1) 'handheld'
#     2) 'handheld - mri'
#     3) 'dual - mri'
#     4) 'dual handheld'
#     5) 'mri 10 button'

#Using a custom button box and need a hand? Contact our tech support team (support@vpixx.com)
listener = rp.ButtonListener("dual - mri") 

Checking for a RESPONSEPixx button press

Once you have a ButtonListener running (see the previous example), you can check for new button activity strategically during your experiment. Below is an example of log checking using an instance of the ButtonListener class.

Reported timestamps are on the VPixx hardware clock. If you want to compare button event timing to some other event, like visual stimulus onset, take frame-accurate timestamps on the same hardware clock (a sketch is given at the end of this example). Do not equate PsychoPy system time and VPixx hardware time; separate clock systems drift and desynchronize over time, making timing comparisons inaccurate.

PY
#Code block best positioned "End Routine" where the routine is a trial 
#Reports all new button activity since the listener was instantiated OR the last call to updateLogs()
#Requires VPixx hardware to be connected, and a ButtonListener must be instantiated 
import numpy as np

#This follows the example above, using a 'dual - mri' button box. 
#Logged button events take the format [timetag, buttonID, eventType]
    # timetag is the time in seconds since your VPixx hardware last booted up
    # buttonID is the button that was active. Below is a list of button IDs to help interpret code:  
        # "handheld" button IDs: {0:'red',1:'yellow',2:'green',3:'blue',4:'white'}
        # "handheld - mri" button IDs: {0:'red',1:'yellow',2:'green',3:'blue'}
        # "dual - mri" button IDs: {0:'red',1:'yellow',2:'green',3:'blue'}
        # "dual handheld" button IDs: {0:'red',1:'yellow',2:'green',3:'blue'}
        # "mri 10 button" button IDs: {0:'red left',1:'yellow left',2:'green left',3:'blue left',
        #                               4:'white left',5:'red right',6:'yellow right',7:'green right',
        #                               8:'blue right',9:'white right'}
    # eventType can be either "Push" or "Release"
#If more than one event is logged, they will all be reported in the output.

#Only listen for red or yellow pushes. You can alter these variables to listen for different events
buttonSubset = [0,1] 
recordPushes = True
recordReleases = False

# update the Din logs for any new activity
listener.updateLogs() 
output = listener.getNewButtonActivity(buttonSubset, recordPushes, recordReleases)

#This bit of code saves the logged events to a .csv file in the working directory. 
#If you call it repeatedly, it will append any new data to the same file.
#You may want to change the file name between blocks or participants to keep data organized.     
myFileName = 'RPxData.csv'
with open(myFileName, 'a') as RPxData:
    #save output to file if there is any new activity
    if np.size(output) > 0:
        np.savetxt(RPxData, output, delimiter=",", fmt='%s', comments="")
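
As referenced above, here is a minimal sketch of taking a frame-accurate stimulus timestamp on the VPixx hardware clock (assuming dp is the imported _libdpx module and the device is connected), which can be compared directly with the button event timetags:

PY
#Queue a marker, then execute it at the next vertical sync while also
#reading back the device status
dp.DPxSetMarker()
dp.DPxUpdateRegCacheAfterVideoSync()
stimOnsetTime = dp.DPxGetMarker() #hardware clock time of the video sync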

Closing the connection to hardware

A simple cleanup code block to run at the end of your experiment.

PY
#Code block best positioned "End Experiment"
#Assumes library has been imported and device is connected
dp.DPxStopAllScheds() #halt any ongoing schedules (audio, digital output, etc.)
dp.DPxWriteRegCache() #execute the stop commands
dp.DPxClose()         #close the USB connection

Calling pypixxlib commands in a Python IDE

Pypixxlib functions like any other site-package in a Python environment. Once you have downloaded our software tools and installed pypixxlib, you can import it at the top of your script just as you would other packages (numpy, psychopy, matplotlib, etc.).

The code snippets in the previous section of this guide provide examples of importing our object-oriented tools or libdpx. The demo section of our pypixxlib documentation also contains several IDE-based demos.

General tips for working with our tools and PsychoPy

Pixel Identity Passthrough

When working with VPixx tools, certain applications require pixel identity passthrough. This refers to the 1-to-1 mapping between a visual stimulus created in software and the stimulus presented on the display. Passthrough is most critical for features that encode information in pixel values, such as Pixel Mode triggers and our special high-bit-depth video modes.

You can learn more about pixel identity passthrough, factors disrupting it, and consequences for our tools here: What is Pixel Identity Passthrough?

The same recommendations in that guide apply to using our products in PsychoPy (disabling dithering, being mindful of gamma corrections, etc.). In addition, here are some specific tips for ensuring pixel identity passthrough in PsychoPy, both in the Builder and in an IDE:

  • Use the RGB255 colour space. The standard PsychoPy colour space (-1 to 1) rounds 8-bit colour values, causing a mismatch between the assigned pixel value and the detected output. In Builder, you may need to create your triggers and pixel sequences using code blocks rather than relying on drag-and-drop objects, which don't support RGB255 (see the sketch after this list).

  • Set interpolation to False. Stimulus classes such as ImageStim accept an argument called 'interpolate,' which effectively antialiases the edges of the stimulus. This property should be set to False for any stimulus used by our special features to avoid pixel value transformations. For an example implementation, see the code in the section Enabling Pixel Mode and creating a Pixel Trigger.
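
The following minimal sketch combines both tips. It assumes an existing PsychoPy window named win with units set to 'pix':

PY
from psychopy import visual

#Draw a mid-grey full-field rectangle with exact 8-bit pixel values:
#the rgb255 colour space avoids rounding, and interpolate=False avoids
#antialiasing at the edges
rect = visual.Rect(win, width=win.size[0], height=win.size[1],
                   colorSpace='rgb255', fillColor=[128, 128, 128],
                   lineColor=None, interpolate=False)
rect.draw()
win.flip()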

Coordinate system for the TRACKPixx3

The TRACKPixx3 uses a Cartesian coordinate system, where the units are in pixels. For example, on a 1920 x 1080 display, the bottom right corner pixel is at [960, -540].

PsychoPy also uses a Cartesian coordinate system, but the default units are normalized. For instance, their bottom right corner value will be [1,-1] by default. We recommend switching your PsychoPy window units to ‘pix’ or pixels to ensure good agreement between these two systems.
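
For example, here is a minimal sketch of creating a pixel-unit window (the display size and screen number are assumptions; match them to your own setup):

PY
from psychopy import visual

#Pixel units keep PsychoPy coordinates in step with TRACKPixx3 screen
#coordinates (both Cartesian, with the origin at the display centre)
win = visual.Window(size=[1920, 1080], units='pix', fullscr=True, screen=1)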

Special video modes and sequencers

Our special high-bit-depth video modes and high-refresh-rate sequencers can be used in Python. However, we recommend using an IDE, not the Builder, for these modes. Most of our video modes require custom shaders or image formatting that cannot be easily incorporated into the Builder’s pre-defined window properties. 

For an example of this formatting and how it can be automated, we have an example of 480Hz or “Quad4x” mode available here: [download]. More examples using our custom shaders will be made available soon.

Still have questions? Pypixxlib code not working? Looking for a specific demo? Contact our technical support staff at support@vpixx.com. Our team of trained vision scientists and software developers can help you get our tools up and running.   

References

Peirce, J. W., Gray, J. R., Simpson, S., MacAskill, M. R., Höchenberger, R., Sogo, H., Kastman, E., & Lindeløv, J. K. (2019). PsychoPy2: Experiments in behavior made easy. Behavior Research Methods, 51, 195–203. https://doi.org/10.3758/s13428-018-01193-y
