Using VPixx Hardware with PsychoPy
In this VOCAL, we will cover how to use VPixx Technologies’ software tools for Python, with an emphasis on using it with the PsychoPy suite of tools for scientists.
If you are new to PsychoPy, we encourage you to check out their Resources for Learning and Teaching webpage to learn more about using their system. In general, PsychoPy offers two methods of working with its tools:
Use PsychoPy’s Builder interface to graphically design an experiment, supplementing their built-in tools with .csv conditions files and custom code blocks.
Import the PsychoPy library into your Python Integrated Development Environment (IDE), like PsychoPy Coder, Spyder or PyCharm.
VPixx Technologies has a library of Python functions for our data acquisition systems called pypixxlib. This library lets users interact directly with our hardware in their Python experiment code. Pypixxlib is the Python equivalent of our Datapixx commands for MATLAB/Psychtoolbox, which you can read more about here: Introduction to VPixx MATLAB documentation.
For a full list of our Python functions and some IDE-based examples, please check out our pypixxlib documentation. There, you will also find instructions on installing our Python tools in the Getting Started section. Common Python troubleshooting problems and their solutions are documented in our Frequently Asked Questions – Troubleshooting.
This VOCAL guide aims to orient researchers to using pypixxlib with PsychoPy. We will start with a general introduction to the two ‘flavours’ of pypixxlib and then discuss how to implement our library within the Builder and in a Python IDE. We will then cover some general tips and recommendations for working with PsychoPy.
If you are using a VIEWPixx /EEG and/or your primary goal is to generate frame-accurate digital TTL triggers, we have a fully automated solution called Pixel Mode. If you use Pixel Mode, you do not need to invoke pypixxlib in your experiment. However, we still recommend you read the Tips section of this guide for helpful information about drawing pixels in PsychoPy so triggers are correctly sent.
For most other applications, you will most certainly have to write some code. This is true even if you are using the Builder; there are no default drag-and-drop tools to support VPixx hardware, so you will need to use custom code blocks to interact with our systems. This guide provides some examples to help you get started.
The two flavours of pypixxlib
There are two general ways to interact with pypixxlib, reflecting two different programming styles. Which approach you decide to take depends on your personal preference, previous programming experience, and general comfort level with coding.
Object-oriented programming (OOP)
OOP groups code elements into classes with unique attributes (properties) and functions (behaviours). Broadly defined parent classes may also have child classes that contain more specific attributes and functions. For example, in PsychoPy the parent class visual contains useful attributes and functions shared by all visual elements. It also has a child class line with attributes and functions related to creating lines.
In OOP, specific instances of classes, called objects, are created and strategically called in the code. Pypixxlib has classes for all of our devices. We also have specific I/O subclasses for each of our different kinds of data acquisition and control (including audio inputs, audio outputs, digital inputs, etc.) that can be called by a device object.
Here’s a simple example: We create an instance of a DATAPixx3 object and then call the setVolume() function of its audio subclass.
from pypixxlib.datapixx import DATAPixx3

myDevice = DATAPixx3()          # create a DATAPixx3 device object
myDevice.open()                 # open the USB connection to the device
myDevice.audio.setVolume(0.5)   # queue a volume change via the audio subclass
myDevice.writeRegisterCache()   # execute all queued commands on the hardware
OOP code tends to be elegant and efficient, but it takes time to conceptualize. If you’re interested in learning more about this programming style, many resources are available on the web.
Wondering about the final line of code in the example, writeRegisterCache()? We will cover this in more detail in the next section of the guide.
Procedural programming
Procedural programming is a style of programming in which individual functions are called step-by-step to build a program. Procedural code is a linear set of instructions, like a recipe. The downside of this approach is that previously written code is harder to reuse or modify, so procedural code can be less efficient.
Our tools for MATLAB are procedural, following the example of Psychtoolbox. Initially, pypixxlib maintained the procedural format, and the OOP tools were added later. The procedural functions can be found in a subsection of pypixxlib called libdpx. Procedural libdpx commands have a prefix, usually DPx.
Here’s the same example as before of setting audio volume, using libdpx procedural code this time:
from pypixxlib import _libdpx as dp

dp.DPxOpen()             # open the USB connection to the device
dp.DPxSetAudVolume(0.5)  # queue a volume change
dp.DPxWriteRegCache()    # execute all queued commands on the hardware
Libdpx commands can be used across all VPixx devices supported by our software tools. Some device-specific functions are prefaced with variations of the DPx prefix. For example, libdpx commands for our eye-tracking systems are prefaced with TPx.
If you have more than one VPixx device connected by USB, such as a PROPixx projector and a PROPixx controller, libdpx commands will target the appropriate device based on the nature of the command. In the example above, DPxSetAudVolume would be sent to the PROPixx controller, as this device manages audio. If you want to force libdpx to target a specific device, use the command DPxSelectDevice and pass the device type or name as an argument, followed by your libdpx command.
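As a sketch of this targeting, the snippet below forces libdpx commands toward the PROPixx controller before setting the volume. The device-name string passed to DPxSelectDevice is an assumption; consult the pypixxlib documentation for the exact identifiers your installation accepts.

```python
def set_controller_volume(volume=0.5):
    """Force libdpx commands to target the PROPixx controller.

    Sketch only: the device-name string below is an assumption; check the
    pypixxlib documentation for the exact identifiers.
    """
    # Imported inside the function so this file can be read and imported
    # without pypixxlib installed.
    from pypixxlib import _libdpx as dp

    dp.DPxOpen()
    dp.DPxSelectDevice('PROPixx Ctrl')  # target the controller explicitly
    dp.DPxSetAudVolume(volume)          # queue the volume change
    dp.DPxWriteRegCache()               # execute it on the hardware
```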
Procedural programming is often more intuitive for beginner programmers. The libdpx commands also closely follow the format of our MATLAB/Psychtoolbox functions, so if you are familiar with our MATLAB tools already, libdpx will feel very similar.
Ultimately, the decision of which strategy to use, OOP or procedural, is up to the user’s preferences. Both provide a solid foundation for working with our Python tools.
A quick review of VPixx’s register writing and updating approach
API control over VPixx devices uses a method called register writing and register updating. When you implement a line of code to control our hardware, this command will not execute immediately. Instead, it waits for a special command that indicates it is time to execute. This command is called a register write or a register update.
dp.DPxSetAudVolume(0.5)
dp.DPxWriteRegCache()
Register writes allow you to execute several queued device commands simultaneously. This execution can be tied directly to the behaviour of your display; VPixx has a series of special register write functions that delay the time of write execution until a specific event in the video signal is detected. This allows you to synchronize your hardware commands with your video frames or visual stimulus onset.
Register updates follow the same principle as register writes, but they also return a copy of the device status. This is useful for getting up-to-date information from our hardware, such as the system clock time and the current state of the I/O ports.
You can read more about our register system in our VOCAL The Logic of VPixx Hardware Control. We highly recommend this guide for researchers interested in precise timing control in their experiments.
For now, the key points to remember are:
With a handful of exceptions, most pypixxlib commands must be followed by a register write or an update for your hardware to execute the requested changes.
A register write executes all queued hardware commands. It should be used in situations where a fast turnaround time is needed, and no data is returned from the device.
A register update (aka a write-read) performs a write and returns a copy of the device status to your PC. It should be used when you need a fresh copy of the status of your hardware.
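To make the distinction concrete, here is a minimal sketch contrasting the two calls, including one of the video-synchronized writes mentioned earlier. The names DPxUpdateRegCache, DPxGetTime and DPxWriteRegCacheAfterVideoSync follow our libdpx naming convention but should be verified against your installed pypixxlib version.

```python
def read_device_time():
    """Register *update*: execute queued commands, refresh the local copy
    of the device status, then read the hardware clock from that copy."""
    from pypixxlib import _libdpx as dp  # deferred so the sketch imports cleanly

    dp.DPxOpen()
    dp.DPxUpdateRegCache()  # write + read back the device registers
    return dp.DPxGetTime()  # device time, valid as of the update above


def set_volume_on_next_frame(volume=0.5):
    """Register *write* deferred to the next vertical sync, tying the
    hardware change to video frame onset."""
    from pypixxlib import _libdpx as dp

    dp.DPxOpen()
    dp.DPxSetAudVolume(volume)
    dp.DPxWriteRegCacheAfterVideoSync()  # executes at the next video sync
```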
It is a good idea to remember this register system when troubleshooting your experiment code. Is your hardware not behaving as expected? You might want to check to make sure you included a register write. Does information retrieved from the device seem outdated or missing? You might want to check when your last register update was and whether you have up-to-date device information.
Invoking pypixxlib in PsychoPy Builder
Our tools are not natively supported in PsychoPy’s Builder interface. This means that when you need to interact with our hardware, you will have to use custom code blocks strategically placed within the experiment flow. These blocks can use either OOP or procedural-style code.
While custom code might sound daunting to new experiment designers, remember that these code blocks don’t need to be overly complicated. We offer many examples below that you can copy or tweak as needed. These code blocks can be reused across multiple experiments accessing the same equipment.
Create custom code blocks in PsychoPy Builder from the Components > Custom > Code Icon
Custom code blocks can be inserted as elements into specific routines in your Experiment Flow. Code blocks also have an additional feature allowing you to specify where in the experiment you would like them to occur.
Below are some common examples of code which might appear in each one of these positions:
Before Experiment: Connect to VPixx hardware, apply settings changes
Begin Experiment: Apply settings changes, upload waveforms into the VPixx device RAM for playback during experiment (e.g., audio files, digital TTL waveforms, custom analog waveforms), turn on any recording schedules that will be on for the entire experiment (e.g., eye tracking).
Begin Routine: Set up and turn on input recording (e.g., listening for button activity), set up output for immediate playback, or playback with a video-based event (e.g., play audio at the same time as a visual stimulus, send a custom TTL waveform on stimulus onset)
Each Frame: Routinely check hardware for important status changes (like a button press being recorded, or the participant fixating a target). Note: frame intervals are typically under 17 ms (16.7 ms at 60 Hz), so you only have a brief window of time to execute this code; try to keep your code to a bare minimum here.
End Routine: Import recorded data and evaluate it, save any important timestamps, disable recording or playback if needed
End Experiment: Shut down any ongoing recordings, restore hardware defaults, close connection to VPixx hardware
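As a minimal sketch, here is what the first and last of these code blocks might contain when using procedural libdpx calls. DPxStopAllScheds and DPxClose are assumed from our libdpx naming convention; verify them against the pypixxlib documentation.

```python
# --- "Before Experiment" code block: connect to the hardware ---
def before_experiment():
    from pypixxlib import _libdpx as dp  # deferred import; sketch only

    dp.DPxOpen()             # open the USB connection
    dp.DPxSetAudVolume(0.5)  # queue any settings changes you need
    dp.DPxWriteRegCache()    # execute the queued settings on the device


# --- "End Experiment" code block: restore defaults and disconnect ---
def end_experiment():
    from pypixxlib import _libdpx as dp

    dp.DPxStopAllScheds()    # stop any running playback/recording schedules
    dp.DPxWriteRegCache()
    dp.DPxClose()            # close the connection to the hardware
```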
Some examples are given below.
Whenever you insert a code block, remember to ensure the Code Type option selected is “Py” or Python.
Examples
Looking for a specific example? Most of our demos are based on customer suggestions. If you are looking for support with a specific feature, and can’t find an example in our VOCAL or demo libraries, contact our team and suggest a new demo. Your request can help the greater VPixx user community!
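As one stand-alone sketch, the snippet below enables Pixel Mode and computes the RGB255 value the top-left pixel must take to encode a given 24-bit trigger. The function name DPxEnableDoutPixelMode and the bit mapping (R to bits 23-16, G to bits 15-8, B to bits 7-0) are assumptions; verify both against the pypixxlib and Pixel Mode documentation.

```python
def enable_pixel_mode():
    """Turn on Pixel Mode so the digital output follows the top-left pixel.
    Sketch only; verify function names against the pypixxlib documentation."""
    from pypixxlib import _libdpx as dp  # deferred so the sketch imports cleanly

    dp.DPxOpen()
    dp.DPxEnableDoutPixelMode()  # digital out now mirrors the top-left pixel
    dp.DPxWriteRegCache()


def trigger_to_rgb(trigger):
    """Convert a 24-bit trigger code to the RGB255 colour of the top-left
    pixel, assuming the mapping R -> bits 23-16, G -> 15-8, B -> 7-0."""
    if not 0 <= trigger <= 0xFFFFFF:
        raise ValueError("trigger must fit in 24 bits")
    return [(trigger >> 16) & 0xFF, (trigger >> 8) & 0xFF, trigger & 0xFF]
```

For example, trigger_to_rgb(5) returns [0, 0, 5]: a near-black pixel that encodes trigger value 5 on the digital output.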
Calling pypixxlib commands in a Python IDE
Pypixxlib functions like other site-packages in a Python environment. Once you have downloaded our software tools and installed pypixxlib, you can import pypixxlib at the top of your script just like you would other packages (numpy, psychopy, matplotlib, etc).
The code snippets in the previous section of this guide provide examples of importing our object-oriented tools or libdpx. The demo section of our pypixxlib documentation also contains several IDE-based demos.
General tips for working with our tools and PsychoPy
Pixel Identity Passthrough
When working with VPixx tools, certain applications require pixel identity passthrough. This refers to the 1-to-1 mapping between a visual stimulus created in software and the stimulus presented on the display. This is most critical for:
Pixel Mode (see Sending Triggers with Pixel Mode)
Register writes or updates on Pixel Sync (see Using Pixel Sync for Stimulus-Accurate Timing)
Certain video modes (see High-Bit-Depth Video Modes)
You can learn more about pixel identity passthrough, factors disrupting it, and consequences for our tools here: What is Pixel Identity Passthrough?
The same recommendations in that guide apply to using our products in PsychoPy (disabling dithering, being mindful of gamma corrections, etc.) In addition, here are some specific tips for ensuring pixel identity passthrough in PsychoPy, both in Builder and an IDE:
Use RGB255 colour space. The standard PsychoPy colour space (-1 to 1) rounds 8-bit colour values, causing a mismatch between pixel value assignment and detected output. In Builder, you may need to create your triggers and pixel sequences using code blocks rather than relying on drag-and-drop objects, which don’t support RGB255.
Set interpolate to False. The ImageStim class contains an argument called ‘interpolate,’ which effectively antialiases the edges of the stimulus. This argument should be set to False for any stimulus used by our special features to avoid pixel value transformations. For an example implementation, see the code in the section Enabling Pixel Mode and creating a Pixel Trigger.
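Putting both tips together, here is a hedged sketch of a one-pixel trigger stimulus for a PsychoPy window whose units are set to ‘pix’. The positioning arithmetic assumes PsychoPy’s centre-origin pixel coordinates; adjust it for your own window setup.

```python
def make_trigger_pixel(win, rgb255):
    """Build a 1 x 1 pixel stimulus in the top-left corner of a PsychoPy
    window, drawn in the rgb255 colour space with interpolation off so the
    pixel value reaches the display unmodified. Sketch only."""
    from psychopy import visual  # deferred import; sketch only

    half_w, half_h = win.size[0] / 2, win.size[1] / 2
    return visual.Rect(
        win, width=1, height=1, units='pix',
        pos=(-half_w + 0.5, half_h - 0.5),  # centre of the top-left pixel
        fillColor=rgb255, colorSpace='rgb255',
        lineColor=None, interpolate=False,
    )
```

In Builder, this could live in a Begin Routine code block, with the returned stimulus drawn in Each Frame before the flip.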
Coordinate system for the TRACKPixx3
The TRACKPixx3 uses a Cartesian coordinate system, where the units are in pixels. For example, on a 1920 x 1080 display, the bottom right corner pixel is at [960, -540].
PsychoPy also uses a Cartesian coordinate system, but the default units are normalized. For instance, their bottom right corner value will be [1,-1] by default. We recommend switching your PsychoPy window units to ‘pix’ or pixels to ensure good agreement between these two systems.
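If you prefer to keep PsychoPy’s default normalized units, a small conversion helper can bridge the two systems. The default resolution below is an assumption; pass your display’s actual size.

```python
def norm_to_trackpixx(x_norm, y_norm, width=1920, height=1080):
    """Convert PsychoPy 'norm' coordinates (centre origin, range -1..1)
    to TRACKPixx3 Cartesian pixel coordinates (centre origin, pixel units).
    The default display size is an assumption; pass your own resolution."""
    return (x_norm * width / 2, y_norm * height / 2)
```

For instance, norm_to_trackpixx(1, -1) returns (960.0, -540.0), matching the bottom right corner example above.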
Special video modes and sequencers
Our special high-bit-depth video modes and high-refresh-rate sequencers can be used in Python. However, we recommend using an IDE, not the Builder, for these modes. Most of our video modes require custom shaders or image formatting that cannot be easily incorporated into the Builder’s pre-defined window properties.
For an example of this formatting and how it can be automated, we have an example of 480Hz or “Quad4x” mode available here: [download]. More examples using our custom shaders will be made available soon.
Still have questions? Pypixxlib code not working? Looking for a specific demo? Contact our technical support staff at support@vpixx.com. Our team of trained vision scientists and software developers can help you get our tools up and running.
References
Peirce, J. W., Gray, J. R., Simpson, S., MacAskill, M. R., Höchenberger, R., Sogo, H., Kastman, E., & Lindeløv, J. K. (2019). PsychoPy2: Experiments in behavior made easy. Behavior Research Methods, 51(1), 195–203. https://doi.org/10.3758/s13428-018-01193-y