The gamma curves are good as regular monitors go, but not quite good enough for a professional model. The gamma value is somewhat too high and, as a consequence, the curves lie below the theoretical one. The resulting image is darker and more contrasty than it should be.
The monitor is set up better in the AdobeRGB mode. The curves rise, nearly coinciding with the theoretical curve. Still, the blue curve deviates somewhat from the ideal one.
The gamma curves in the sRGB mode are set up better than in Custom but worse than in AdobeRGB.
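To illustrate why a too-high gamma makes the picture darker, here is a minimal Python sketch (my own illustration, not part of the review's methodology) comparing the theoretical power-law curve with a hypothetical, slightly higher measured gamma:

```python
# Illustration: a power-law gamma curve maps normalized input to output
# luminance. A gamma higher than the target (here 2.2) pushes the curve
# below the theoretical one, darkening the midtones.

def gamma_curve(x, gamma):
    """Output luminance for normalized input x in [0, 1]."""
    return x ** gamma

target = 2.2    # theoretical curve the monitor should follow
measured = 2.5  # hypothetical, somewhat-too-high gamma

for level in (0.25, 0.5, 0.75):
    ideal = gamma_curve(level, target)
    actual = gamma_curve(level, measured)
    print(f"input {level:.2f}: ideal {ideal:.3f}, actual {actual:.3f}")
```

At every midtone input the "actual" output is lower than the "ideal" one, which is exactly the darker, more contrasty picture described above.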
I ran the calibration procedure for this mode in the Emulation section of Natural Color Expert:
Just as you could expect, Natural Color Expert makes no attempt to correct the shape of the gamma curves: the diagram barely changed. Calibration with the calibrator's native software gives better results, but then the correction table is written into the graphics card rather than into the monitor.
The monitor offers 12 color temperature presets in the Custom mode plus a manual setting, but the presets have names like Cool3 rather than specific numeric values. Color temperature is fixed in the sRGB and AdobeRGB modes. In the Calibration and Emulation modes it is specified in Natural Color Expert, so I didn't include those modes in the table below.
Alas, the setup quality is far from perfect. Such a big difference between the temperatures of different levels of gray is too much for a professional monitor. For comparison, this difference varies from a few tens to a few hundred degrees in the NEC MultiSync LCD2190UXi, whereas here it can reach 2000K and more. The sRGB mode delivers a temperature of about 6500K, which is appropriate for the namesake standard. The AdobeRGB mode is colder for some reason, although it should also produce 6500K (illuminant D65, to be exact); the temperature proved to be about 7000K in reality. This can be corrected by means of calibration, though.
One of the common problems of monitors with LED backlight is the uniformity of color temperature. It is caused by the use of multiple LED triads: if the parameters of the LEDs in different triads differ even a little, the triads will produce somewhat different light.
To check whether this problem affects the SyncMaster XL24, I measured the color temperature of white in 25 points of the screen.
So, the difference amounts to about 400K. In other words, not only different levels of gray but also different points of the screen differ in temperature. This problem could be solved by selecting only LEDs that meet tight tolerances or by setting up each monitor individually, as is done to achieve uniform brightness of white.
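The 400K figure is simply the spread between the warmest and coldest of the 25 measurement points. A sketch of that arithmetic, with made-up sample readings (the review's actual per-point values are not reproduced here):

```python
# Hypothetical 5x5 grid of white-point color temperatures (K) measured
# across the screen; the real readings from the review are not shown.
readings = [
    [6400, 6450, 6500, 6480, 6420],
    [6450, 6550, 6600, 6570, 6460],
    [6500, 6600, 6700, 6650, 6520],
    [6460, 6560, 6650, 6600, 6470],
    [6410, 6460, 6510, 6490, 6430],
]

# Flatten the grid and take the max-min spread as the uniformity figure.
flat = [t for row in readings for t in row]
spread = max(flat) - min(flat)
print(f"uniformity spread: {spread} K")  # 6700 - 6400 = 300 K here
```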
The maximum brightness is about 200 nits, and the contrast ratio is 400:1. These values were measured with a ColorVision Spyder Pro calibrator, which produces somewhat understated results. In the sRGB and AdobeRGB modes the level of brightness matches the requirements of the namesake standards, but the contrast ratio does not: the level of black is as high as 2.77 nits in the AdobeRGB mode, whereas the AdobeRGB standard demands about 0.56 nits. Both parameters, brightness and contrast ratio, can be set up during the calibration of the monitor in the Emulation mode, but it would be better to have them set correctly right at the factory.
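Static contrast ratio is just the ratio of white to black luminance, which shows why the high black level hurts. A sketch using the figures quoted above:

```python
def contrast_ratio(white_nits, black_nits):
    """Static contrast ratio: white luminance divided by black luminance."""
    return white_nits / black_nits

# ~200 nits of white at a 400:1 contrast ratio implies a black level
# of about 0.5 nits.
print(round(contrast_ratio(200, 0.5)))  # 400

# In AdobeRGB mode the measured black was 2.77 nits instead of the
# ~0.56 nits the standard expects, so at the same white level the
# contrast ratio is roughly five times lower than it should be.
print(round(contrast_ratio(200, 2.77)))  # about 72
```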
Although the XL24 is based on a PVA matrix, which is inherently slow, it is equipped with response time compensation. As a result, its speed proves good enough for games, let alone for work: the average response time is only 6.7 milliseconds (GtG) with a maximum of 16.5 milliseconds.
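The "average of 6.7 ms (GtG)" summarizes a whole matrix of gray-to-gray transition times. A sketch of how such a summary is computed, using hypothetical timings rather than the review's actual matrix:

```python
# Hypothetical gray-to-gray response times (ms) for a handful of
# transitions; a real measurement covers a matrix of start/end levels.
gtg_times = [4.1, 5.3, 6.0, 6.7, 7.2, 9.8, 16.5]

average = sum(gtg_times) / len(gtg_times)
worst = max(gtg_times)
print(f"average {average:.1f} ms, maximum {worst:.1f} ms")
```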
There are RTC-provoked artifacts, too. They show up as light or dark edges around moving objects. The average RTC miss is 9.0% with a maximum of 42.9%. The artifacts won't be too annoying, yet you can notice them if you look for them.
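One common way to define the RTC miss is the amount by which the transient brightness peak overshoots the target level, as a percentage of the transition amplitude; the reviewer's exact methodology may differ. A sketch with hypothetical gray levels:

```python
def rtc_miss_percent(start, target, peak):
    """Overshoot of the transient peak past the target gray level,
    as a percentage of the intended transition amplitude."""
    return 100.0 * (peak - target) / (target - start)

# Hypothetical transition: from gray 64 to gray 128, with the RTC
# circuit briefly overshooting to level 155.
print(f"{rtc_miss_percent(64, 128, 155):.1f}%")  # 42.2%
```

An overshoot of this magnitude is what produces the light edges around moving objects mentioned above.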
I’ll give you a summary of the XL24 in the Conclusion. Right now let’s proceed to the 30-inch XL30 model.