Catalog of Regulatory Science Tools to Help Assess New Medical Devices
This regulatory science tool is a laboratory method for the evaluation of contrast perception on head-mounted displays.
Technical Description
This regulatory science tool (RST) is a method for evaluating contrast perception on medical extended reality (MXR) devices that use virtual reality (VR) or augmented reality (AR) head-mounted displays (HMDs). Image quality evaluation for VR and AR HMDs is primarily based on optical bench measurements of a single eyepiece. However, this approach generally requires a complicated bench setup that emulates human eye geometry. In addition, binocular image quality discrepancies on HMDs may affect perceptual performance, an effect that monocular bench measurements cannot capture. This RST describes a method and software platform to characterize image quality on VR and AR HMDs using human perceptual experiments. Specifically, the contrast sensitivity response (CSR) of human participants is measured at multiple positions across the HMD field of view (FOV).
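For context, the detection target used in contrast sensitivity experiments of this kind is typically a Gabor patch: a sinusoidal grating modulated by a Gaussian envelope. The sketch below is not taken from the tool's source; all function names, parameter names, and values are illustrative, and it only shows how such a target and its Michelson contrast relate:

```python
import numpy as np

def gabor_patch(size_px=128, cycles_per_patch=4.0, contrast=0.5,
                sigma_frac=0.15, mean_luminance=0.5):
    """Generate a Gabor target: a sinusoidal grating in a Gaussian envelope.

    `contrast` is the Michelson contrast of the underlying grating; an
    adjustment procedure would vary this value to find the threshold.
    All defaults here are illustrative, not the tool's settings.
    """
    coords = np.linspace(-0.5, 0.5, size_px)
    x, y = np.meshgrid(coords, coords)
    grating = np.cos(2 * np.pi * cycles_per_patch * x)   # vertical grating
    envelope = np.exp(-(x**2 + y**2) / (2 * sigma_frac**2))
    # Luminance stays within [mean*(1-contrast), mean*(1+contrast)]
    return mean_luminance * (1 + contrast * grating * envelope)

patch = gabor_patch()
```

Lowering `contrast` toward the participant's threshold makes the grating fade into the uniform background, which is the quantity the experiment estimates.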
The tool includes the following components:
- A browser-based test platform [1]: This RST provides a software test platform developed using WebXR Device API [2] to perform human perceptual experiments on VR or AR HMDs in an immersive environment. This tool can be loaded on an HMD web browser that supports WebXR.
- Test method: The test platform includes a user interface for setting the input parameters of the perceptual experiment. In the setup stage prior to the experiment, the target dimensions, locations, spatial frequencies, and background noise are entered as input parameters using an HMD controller and/or a Bluetooth keyboard.
During the experiment, a detection target (i.e., a Gabor target) is shown at the center of the HMD FOV or at various locations across it. The human participant adjusts the contrast of the target using an HMD controller or a Bluetooth keyboard to determine the threshold contrast (i.e., the minimum detectable contrast) on the HMD. The experiment is repeated at all target locations and spatial frequencies specified in the setup stage.
- Output data and analysis: All responses of a human participant (i.e., the threshold contrast determined in each trial) are stored in an output JSON file. A Python script is provided to support data analysis by computing and plotting contrast sensitivity as a function of spatial frequency for each measured target position.
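As a rough illustration of the analysis step, the sketch below converts per-trial threshold contrasts into contrast sensitivity (the reciprocal of the threshold contrast) grouped by target position. The JSON field names and values here are assumptions for illustration only; the tool's actual output schema is described in the repository documentation:

```python
import json
from collections import defaultdict

# Hypothetical trial records; the field names and values are assumed,
# not the tool's actual output schema.
raw = json.loads("""[
  {"position": "center", "spatial_frequency": 1.0, "threshold_contrast": 0.010},
  {"position": "center", "spatial_frequency": 2.0, "threshold_contrast": 0.012},
  {"position": "center", "spatial_frequency": 4.0, "threshold_contrast": 0.025},
  {"position": "left",   "spatial_frequency": 1.0, "threshold_contrast": 0.015}
]""")

def contrast_sensitivity(trials):
    """Group trials by target position and convert each threshold
    contrast to sensitivity (the reciprocal of the threshold)."""
    csr = defaultdict(list)
    for t in trials:
        csr[t["position"]].append(
            (t["spatial_frequency"], 1.0 / t["threshold_contrast"]))
    for pos in csr:
        csr[pos].sort()  # order by spatial frequency for plotting
    return dict(csr)

csr = contrast_sensitivity(raw)
```

Each per-position list of (spatial frequency, sensitivity) pairs can then be plotted to give the contrast sensitivity curve at that FOV location.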
Detailed step-by-step instructions for implementing this tool are available at https://didsr.github.io/COPE/documentation/.
Intended Purpose
This test method is intended to characterize contrast perception on VR or AR HMDs by measuring the contrast sensitivity response. It can be used to evaluate binocular image quality, which has not been well established for VR and AR HMDs. The method can be implemented by medical device developers for HMD image quality evaluation.
Testing
The WebXR test platform has been tested by performing contrast perceptual experiments on a VR HMD (Meta Quest 2) [3]. Specifically, monocular and binocular contrast sensitivity responses at various target locations across the HMD field of view have been measured using the method and software described in this tool.
Compatibility of the tool has been evaluated on multiple VR and AR HMDs including Meta Quest 2, Quest 3, and Magic Leap 2.
Limitations
This tool requires access to a WebXR-compatible browser on the evaluated HMD. Compatibility of the HMD needs to be checked before use.
The rendered image resolution depends on the WebXR rendering engine. Spatial frequencies above 6 cycles per degree may be subject to aliasing effects and rendering limitations, so performing contrast sensitivity experiments above 6 cycles per degree is not recommended.
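The 6 cycles per degree guideline is consistent with basic sampling limits: a rendered grating needs at least two pixels per cycle to avoid aliasing. A minimal check of that relationship, assuming a hypothetical effective rendering resolution in pixels per degree (the actual value depends on the HMD and the WebXR rendering engine):

```python
def nyquist_limit_cpd(pixels_per_degree):
    """Highest spatial frequency (cycles/degree) representable without
    aliasing: each cycle needs at least two pixels (Nyquist criterion)."""
    return pixels_per_degree / 2.0

# Illustrative value only: the effective angular resolution delivered by
# a WebXR rendering engine is typically lower than the panel's native
# pixels per degree.
effective_ppd = 12.0  # assumed effective rendering resolution
limit = nyquist_limit_cpd(effective_ppd)  # 6.0 cycles per degree
```

In practice the usable frequency range should be verified per device, since rendering-pipeline filtering can reduce it further below the Nyquist limit.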
Supporting Documentation
[1] GitHub repository: https://github.com/DIDSR/COPE
- The WebXR-based test platform is available at https://didsr.github.io/COPE/
- Documentation and instructions for the tool are available at https://didsr.github.io/COPE/documentation/
[2] WebXR documentation: https://immersiveweb.dev/
[3] Khushi Bhansali, Miguel A. Lago, Ryan Beams, Chumin Zhao, "Evaluation of monocular and binocular contrast perception on virtual reality head-mounted displays," J. Med. Imag. 11(6) 062605, 2024. https://doi.org/10.1117/1.JMI.11.6.062605
Contact
Tool Reference
- RST Reference Number: RST26MX01.01
- Date of Publication: 05/04/2026
- Recommended Citation: U.S. Food and Drug Administration. (2026). COPE: Contrast Perception Evaluation on Head-Mounted Displays (RST26MX01.01). https://cdrh-rst.fda.gov/cope-contrast-perception-evaluation-head-mounted-displays