The Psychophysics Toolbox is a free set of Matlab functions for vision research (Brainard, 1997; Pelli, 1997). The software runs in the Matlab numerical programming environment and requires Matlab 5.2.1 or later. A polished version is available for Macintosh computers running Mac OS 9, and fairly complete but not yet thoroughly tested beta versions are available for Mac OS X and Windows. The toolbox is free to download, and there are several inexpensive ways to buy Matlab.
We now use this environment for all our experiments and modeling. The key test, for us, is how long it takes a new student to implement a new experiment. Canned programs fall short because they usually can't run a genuinely new experiment. In C, it generally takes six months (including learning C). In Matlab, with the Psychophysics Toolbox, it takes a few weeks (including learning Matlab). The tutorial lays out the key ideas that will get you started.
Brand-new users who've never programmed before will find that they're learning three things: Matlab, how to create stimuli and measure responses, and how to organize an experiment. There's almost no overlap between those topics, though, of course, you'll be learning Matlab throughout. Most of the included demos (type "help psychdemos") focus on how to create stimuli and measure responses; PhaseDistortDemo and QuestDemo focus on how to organize an experiment. For learning the language, many people say they liked the Matlab manual; others skipped it and learned by trial and error. Everyone uses HELP frequently. It's one of Matlab's best features.
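If QUEST is new to you, the skeleton of a QuestDemo-style threshold measurement looks roughly like this. The Quest routines below are part of the toolbox (see "help Quest"); TrialOfYourExperiment is a hypothetical stand-in for your own stimulus-and-response code, and the parameter values are only illustrative.

```matlab
% Minimal QUEST staircase sketch (see QuestDemo.m for the real thing).
tGuess=-1;                          % prior threshold guess, in log10 contrast
tGuessSd=2;                         % standard deviation of that prior
pThreshold=0.82;                    % proportion correct that defines threshold
beta=3.5; delta=0.01; gamma=0.5;    % psychometric function parameters
q=QuestCreate(tGuess,tGuessSd,pThreshold,beta,delta,gamma);
for trial=1:40
    tTest=QuestQuantile(q);                 % intensity recommended for this trial
    response=TrialOfYourExperiment(tTest);  % show stimulus, collect 0 or 1
    q=QuestUpdate(q,tTest,response);        % fold the result into the posterior
end
fprintf('Threshold estimate: %.2f +- %.2f\n',QuestMean(q),QuestSd(q));
```

The point of the loop is the division of labor: QUEST picks each trial's intensity and accumulates the posterior, while your own code does everything stimulus-specific.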
The Psychtoolbox is popular. It's been downloaded thousands of times (Win, Mac OS 9, Mac OS X). Its forum is quite active, averaging more than 3 messages a day. Principal investigators and their collaborators have identified 124 grant-supported projects that use it. And 404 papers cite it.
The attraction of using computer displays for visual psychophysics is that they allow software specification of the stimulus. Programs to run experiments are often written in a low-level language (e.g. C or Pascal) to achieve full control of the hardware for precise stimulus display. Although these low-level languages provide power and flexibility, they are not conducive to rapid program development. Interpreted languages (e.g. BASIC, LISP, Mathematica, and Matlab) are abstracted from hardware details and provide friendlier development environments, but don't provide the hardware control needed for precise stimulus display. The Psychophysics Toolbox is a software package that adds this capability to the Matlab application on Macintosh and Windows computers.
Matlab is a high-level interpreted language with extensive support for numerical calculations. The Psychophysics Toolbox interfaces between Matlab and the computer hardware. The Psychtoolbox's core routines provide access to the display frame buffer and color lookup table, allow synchronization with the vertical retrace, support millisecond timing, and facilitate the collection of observer responses. Ancillary routines support common needs like color space transformations and the QUEST threshold-seeking algorithm. The Showtime extension (formerly called "QT") makes it easy to save dynamic stimuli as QuickTime movie files that can be displayed on the web.
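In code, a trial built from these core routines looks roughly like this. This is a hedged sketch, not a working program: it assumes the Mac OS 9 SCREEN interface (calling conventions vary slightly between releases; see "help screen" and "help KbCheck"), and myClut and stimulus stand in for arrays you would compute yourself.

```matlab
% Sketch of the core routines in use (Mac OS 9 SCREEN interface).
% myClut (256x3) and stimulus (an image matrix) are assumed, not defined here.
window=SCREEN(0,'OpenWindow');          % take over display 0's frame buffer
SCREEN(window,'SetClut',myClut);        % load the color lookup table
SCREEN(window,'WaitBlanking');          % synchronize with the vertical retrace
t0=GetSecs;                             % millisecond-precision clock
SCREEN(window,'PutImage',stimulus);     % copy the image to the frame buffer
while ~KbCheck; end                     % wait for any keypress
[keyIsDown,secs,keyCode]=KbCheck;       % read the response and its timestamp
SCREEN('CloseAll');                     % restore the display
```

Each capability listed above corresponds to one line here: frame buffer and lookup-table access, retrace synchronization, timing, and response collection.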
MovieDemo.m is a simple Matlab program that uses the Psychophysics Toolbox to display a growing disk. First it creates a series of images in memory, then it plays the movie, displaying a new image on each frame.
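In outline, a MovieDemo-style program might look like this. This is a simplified sketch, not the demo's actual code: the frame count and disk sizes are made up, and SCREEN's calling conventions vary slightly between releases.

```matlab
% Growing-disk movie in the style of MovieDemo.m (simplified sketch).
n=20;                                       % number of movie frames
window=SCREEN(0,'OpenWindow');              % full-screen window on display 0
for i=1:n                                   % first, create the images in memory
    frame(i)=SCREEN(window,'OpenOffscreenWindow',255);  % white offscreen buffer
    SCREEN(frame(i),'FillOval',0,[0 0 10*i 10*i]);      % black disk, growing
end
for i=1:n                                   % then, play the movie
    SCREEN(window,'WaitBlanking');          % wait for the vertical retrace
    SCREEN('CopyWindow',frame(i),window);   % show a new image on this frame
end
SCREEN('CloseAll');                         % restore the display
```

Precomputing the frames as offscreen windows is what makes frame-accurate playback possible: during the display loop, each frame needs only a fast copy after the retrace.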
Other example programs provided in the distribution implement drifting and flickering gratings, visual search, detection of dot symmetry and motion, lightness matching, magnitude estimation of line length, and measurement of contrast thresholds in static noise. Many of these were designed as experimental modules for an undergraduate laboratory course on perception and may be useful for teaching. In an email poll, users reported using the toolbox to measure a variety of psychophysical thresholds (e.g. detection of gratings and letters in noise), to display stimuli for functional MRI and electrophysiological experiments, to measure asymmetric color matches, to evaluate image compression algorithms, and to study categorization, perceptual learning, visual search, and visual object recognition.
We are building a library of user-supplied programs that run real experiments and demos, as examples for others. We invite everyone to send software to the Psychtoolbox forum, which automatically archives the message and enclosure. We add links on the library page to programs in the forum that appear to have enduring value.
Complete sources are provided for interested users. Some of the functions are written in Matlab, but the key routines are Matlab extensions (MEX for Mac and DLL for Windows), which are called as high-level Matlab functions but written in C. (Users of the Psychophysics Toolbox do not need to know C.) The core routines rely on the VideoToolbox subroutines (Pelli, 1997), which provide low-level control of the Macintosh display hardware, and corresponding routines that Allen Ingling wrote for Windows.
Version 1 (released 1995 [Mac only]) was written by David Brainard, with some help from others and support from an NSF ILI grant awarded to David Brainard. Version 2 (released 1996 [Mac] and 2000 [Win]) was written by David Brainard and Denis Pelli, with help from others, and is actively maintained (see changes [Mac & Win] and credits [Mac & Win]). The Win 2.5 and Mac OS X 1.0 releases are Allen Ingling's work. Send questions and comments to the psychtoolbox forum at: firstname.lastname@example.org. Send requests for update notification to email@example.com.
If you want to acknowledge use of this software when you publish your research, you might say something like this:
"We wrote our experiments in Matlab, using the Psychophysics Toolbox extensions (Brainard, 1997; Pelli, 1997)."
Brainard, D. H. (1997) The Psychophysics Toolbox, Spatial Vision 10:433-436.
Pelli, D. G. (1997) The VideoToolbox software for visual psychophysics: Transforming numbers into movies, Spatial Vision 10:437-442.
We're happy and grateful to find that more and more users are now citing their use of this software (see list). Getting this credit helps us justify the time we continue to devote to developing and maintaining this free software for use by the entire vision community. We note that some users of the Psychophysics Toolbox have been citing only Brainard and not Pelli, presumably because these users aren't aware that they are using the VideoToolbox. However, as our changes page documents, our enhancements and debugging of the Psychophysics Toolbox are roughly evenly split between changes at both levels of software. Mentioning the VideoToolbox by name isn't important, so we've dropped it from the suggested citation, above, but citing the active contributions of both authors is important, as we both want credit for our contributions to your research.
We're always interested in hearing about how the software's being used. Everyone is welcome to discuss their work with the psychtoolbox on the Psychtoolbox forum at firstname.lastname@example.org.
(Psychtoolbox 2.11, which is still available, supports both PowerPC and 68K Macs; subsequent versions have been tested only on PowerPC.)
A National Eye Institute vision core grant to NYU is providing a full-time programmer to support the Psychophysics Toolbox: Allen Ingling. Allen's first priority was to make the Windows version work well. Version 2.5 is the fruit of that effort. Current development is directed toward a new version for Mac OS X that is internally based on OpenGL. After the OS X Psychtoolbox is complete we will update the Windows Psychtoolbox to use OpenGL and support dual displays.
Presently we have no plans to provide a Linux Psychtoolbox. We'd be delighted to help anyone who would like to help write a work-alike for Linux. Let us know.
Also see "Future of the Psychophysics Toolbox".
For the most part, the routines, the example code (excluding the library page), and the documentation are copyright (c) David Brainard and Denis Pelli, 1992-2002. The VideoToolbox routines are copyright (c) Denis Pelli. SoundPlay and certain code fragments are copyright (c) Apple Computer. The MEX-file interface is copyright (c) The MathWorks. Routines getftype and setftype are copyright (c) Erik Johnson. The instructional handouts are copyright (c) David Brainard or David Brainard and John Philbeck, as noted on the individual handouts. Showtime.mex (formerly called "QT.mex") is the Matlab version of Showtime, copyright (c) Andrew Watson, Cesar Ramirez, James Hu, and Denis Pelli. Other copyrights are noted in appropriate places.
This package may be distributed freely as long as it is distributed in its original form. It may not be sold without permission. To make the software as widely available as possible, we hereby grant permission to include the toolbox archives, intact, in CD-ROM collections selling for less than $100, but we retain all rights. To acknowledge use of the software, you might consider a citation; an example citation is provided above.
The various instructional handouts may be included in photocopied readers used for instructional purposes. Any such use should include an acknowledgment, copyright notice, and a notice that reproduction is by permission.
The package is distributed as is, without any warranty. No liability is accepted for any damage or loss resulting from the use of these routines.
|Drifting grating. QuickTime movie created by ShowtimeDemo.m.|
|Biological motion. First devised by Gunnar Johansson in 1973, these point-light animations depict complex human actions with only a handful of motion tokens strategically placed on the body and limbs (Ahlstrom, Blake & Ahlstrom, 1997). We use these animations for psychophysical (Grossman & Blake, 1999) and fMRI studies (Grossman, Donnelly, Price, Morgan, Pickens, Neighbor & Blake, in press, Journal of Cognitive Neuroscience).|
Emily Grossman, Vanderbilt University
|Selectivity of local and global motion detectors. ON-centre DOGs move coherently towards the center, OFF-centre DOGs move incoherently. Each has a lifetime of one displacement. P. J. Bex and S. C. Dakin (2000) Narrowband local and broadband global spatial frequency selectivity for motion|
Peter Bex, University of Essex
|2nd-order Gabor in the spatial frequency domain. “The idea was inspired by Solomon and Pelli (1994). I asked Josh how he did the Nature cover, and I expanded his method of varying spatial frequency in the horizontal direction to having it vary in the vertical direction as well. To do this required making as many images as there are pixels, and saving one pixel from each image that had the spatial properties required by its location in the gabor.”|
Ben Singer, University of Rochester
|Stimulus used to test the effect of motion on object recognition.|
Stone JV (1999) "Object recognition: view-specificity and motion-specificity," Vision Research, 39, 4032-4044.
Jim Stone, University of Sheffield
|Chromatic induction with remote chromatic contrast varied in magnitude, spatial frequency and chromaticity. Barnes CS, Wei J, and Shevell SK (1999) Vision Research|
Steve Shevell, University of Chicago
|…motion. Bellefeuille, A. & Faubert, J. (1998) Independence of contour and biological motion cues for motion-defined animal shapes. Perception 27, 225-236.|
Jocelyn Faubert, Université de Montréal
David Brainard, Denis Pelli & Allen Ingling.
(Mail sent to the public Psychtoolbox forum is read by David, Denis and Allen.)