Response time is perhaps the most famous parameter of LCD monitors, yet it is not a simple one.
There are currently two methods of measuring it: ISO 13406-2 and the so-called GtG method. The first measures the total time it takes a pixel to switch from black to white and back to black. The problem is that for many existing matrix manufacturing technologies this particular transition happens to be the fastest one, so this measurement yields the minimum possible value.
The GtG method is a more honest one. It measures the average time it takes to perform transitions between all the possible pixel states (GtG stands for Gray to Gray, meaning all transitions between halftones). So, if the matrix yields a very fast black-white-black transition but is slow on halftone transitions, the fast transition will account for an insignificant part of the total and the GtG average will describe the matrix’s real speed truthfully.
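The averaging described above can be sketched in a few lines of Python. The transition-time matrix below is entirely made up for illustration (a real test measures each value with the photo-sensor described later); note how the two fast black-white transitions barely move the average, which stays dominated by the slow halftone transitions.

```python
# Hypothetical GtG averaging: times[i][j] is the measured time in milliseconds
# to switch a pixel from gray level levels[i] to gray level levels[j].
# All values are invented for illustration only.
levels = [0, 64, 128, 192, 255]  # a coarse 5-level grid; real tests use more points

times = [
    [0.0, 12.0, 14.0, 13.0,  4.0],   # black-to-white (4.0 ms) is very fast here
    [11.0, 0.0, 15.0, 14.0, 12.0],
    [13.0, 14.0, 0.0, 16.0, 13.0],
    [12.0, 15.0, 14.0, 0.0, 11.0],
    [ 3.0, 12.0, 13.0, 12.0, 0.0],   # white-to-black (3.0 ms) is also fast
]

# Average every actual transition (initial != final): this is the GtG number.
transitions = [times[i][j]
               for i in range(len(levels))
               for j in range(len(levels))
               if i != j]
gtg_average = sum(transitions) / len(transitions)

# Despite the ~4 ms black-white-black pair, the slow halftone transitions
# pull the GtG average up to about 12 ms.
print(gtg_average)
```

With this toy data the black-white-black round trip would look like a roughly 7 ms monitor under the ISO method, while the GtG average honestly reports a speed above 12 ms.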
Manufacturers use whichever method yields the better number for a particular monitor: a response time of 5 milliseconds ISO is specified for modern monitors with RTC-less TN matrixes, while monitors with RTC have a specified response of 4 milliseconds GtG, yet the practical difference between such monitors amounts to 300-400% rather than the 25% the numbers seem to suggest. We therefore measure response time according to the GtG method as the more objective and illustrative measure of matrix speed.
To perform the measurement, we attach a photo-sensor to the monitor's screen; it tracks every change in the brightness of the pixels beneath it.
From a technical point of view, this photo-sensor consists of a Vishay BPW21R photodiode and a current-to-voltage converter based on a low-noise operational amplifier, the AD795JR from Analog Devices. The photo-sensor is linear in a brightness range of 0.5 to 700 candelas per sq. m and has its own response time of about 0.3 milliseconds, which allows testing any modern LCD monitor with a high degree of precision. To work with lower brightness levels, the photo-sensor circuit includes an amplification switch that increases its sensitivity tenfold.
The sensor sends its signal to a Velleman PCSU1000 oscilloscope, which records the changes in brightness of the pixels under the sensor over time with a resolution of hundredths of a millisecond. A special program then processes the oscillograms and calculates the time it took the brightness to change, i.e. the response time.
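The processing step can be sketched as follows. This is a simplified, hypothetical version of what such a program might do; it assumes the common convention of timing the transition between 10% and 90% of the brightness swing, which the article does not explicitly confirm for this test method.

```python
# Hypothetical oscillogram processing: given brightness samples taken at a
# fixed interval, find how long the pixel takes to travel from 10% to 90%
# of the total brightness swing (a common convention; the actual thresholds
# used by the test program are an assumption here).

def transition_time(samples, dt_ms):
    """samples: brightness readings taken every dt_ms milliseconds."""
    start, end = samples[0], samples[-1]
    swing = end - start
    lo = start + 0.1 * swing   # 10% of the way to the final level
    hi = start + 0.9 * swing   # 90% of the way to the final level

    rising = swing > 0
    def crossed(value, threshold):
        return value >= threshold if rising else value <= threshold

    # Index of the first sample past each threshold.
    i_lo = next(i for i, v in enumerate(samples) if crossed(v, lo))
    i_hi = next(i for i, v in enumerate(samples) if crossed(v, hi))
    return (i_hi - i_lo) * dt_ms

# A made-up trace sampled every 0.01 ms: the pixel ramps from 10 to 250 units.
trace = [10, 10, 34, 58, 82, 106, 130, 154, 178, 202, 226, 250, 250]
print(transition_time(trace, 0.01))  # transition time in milliseconds
```

Running this function over one recorded trace gives a single cell of the transition matrix; repeating it for every pair of initial and final gray levels fills in the full GtG picture.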
The result of the measurements is shown in our articles as a 3D histogram: one horizontal axis shows the initial state of the pixel, the other horizontal axis shows its final state, and the height of the column above each pair shows the time it took to switch between those two states.
In most cases it suffices to know the average of all the measured transitions (i.e. the response time as measured according to the GtG method), and we publish this average, too.
Talking about specific numbers, a monitor with a response time of less than 5 milliseconds GtG can be considered very fast today. A response time of 5 to 10 milliseconds GtG is fast, while a response of over 10 milliseconds GtG is rather slow. For comparison, older PVA and MVA matrixes without response time compensation would show over 15 milliseconds if measured according to the GtG method. Their specified response time, according to the ISO method, was 16 or 25 milliseconds, but as you remember, the ISO method measures a full black-white-black round trip, whereas the GtG method measures one-way transitions only and averages them all.
Thus, monitors with a specified response time of 4 milliseconds GtG are fast, whereas monitors with a specified response time of 5 milliseconds ISO are slow, contrary to what the bare numbers suggest.