
Extended Color Gamut: Highs and Lows

If you want to know what a color gamut is, why it is rather small on most existing monitors, and how it can be enlarged, refer to the appropriate section of our article Contemporary LCD Monitor Parameters: Objective and Subjective Analysis.

Theoretically, a larger color gamut is always an indisputable advantage as it enables the monitor to display colors that a monitor with a smaller color gamut can never show. Do not confuse the color gamut with the number of colors a monitor can display, which is usually 16.2 or 16.7 million. These are two complementary things. The color gamut is the range of colors the monitor can display, while the number of colors is how many gradations this range is split into in order to display intermediate hues or halftones. The two parameters are not directly interrelated. Theoretically, it is possible to make a monitor with just four colors and a huge color gamut. Such a monitor would only display pure green, pure blue, pure red or pure white – without any halftones – but those four colors would indeed be very pure.

Thus, you can get a purer, more saturated color on an extended-gamut monitor even if you have a prehistoric graphics card with 16-bit color representation or are an inveterate user of Windows 3.11 for Workgroups. The color gamut is a hardware property of the monitor and does not depend on what system the monitor is connected to.

So, the two parameters do not affect each other, yet in some situations they should be discussed together. Obviously, the number of colors determines the difference between two adjacent colors: the more colors the monitor can display, the smaller this difference is. The entire space of colors the monitor can reproduce is split into 16.7 million dots, and any specific color can only be rendered as accurately as the nearest of these dots.

And when this space – the color gamut – gets larger but the number of dots remains the same, the difference between adjacent dots grows bigger. So although an extended-gamut monitor can show more colors in the physical sense, it does so less accurately. This lack of color precision can be observed in smooth color gradients: they appear banded, each band corresponding to one color dot.
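To put a number on the banding: with 8 bits per channel, a single-channel gradient has at most 256 distinct levels to cover the whole screen width, no matter how wide the gamut is. A minimal sketch (the 1920-pixel screen width is an assumed example, not a value from the article):

```python
# With 8 bits per channel, one color component can take 2**8 = 256 values,
# so a single-channel gradient has at most 256 distinct bands.
def step_count(bits_per_channel: int) -> int:
    """Number of distinct levels one color component can take."""
    return 2 ** bits_per_channel

levels = step_count(8)              # 256 levels
width = 1920                        # assumed full-screen width in pixels
pixels_per_band = width / levels    # 7.5 pixels per visible band
print(levels, pixels_per_band)      # 256 7.5
```

Widening the gamut stretches those same 256 levels over a larger range of colors, which is why each band becomes a bigger visual step.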

In fact, you can see this effect even with the 24-bit color representation that is standard today (graphics cards work with 32-bit color representation, but only 24 bits describe the color proper, the remaining 8 bits being employed for auxiliary purposes; in fact, those extra 8 bits were introduced only because graphics cards find it easier to process 4-byte numbers than 3-byte ones). Try stretching a gradient from red to black to full screen and you will see that it is striped rather than smooth even on the best LCD monitors (bad monitors will add wide and irregular bands of their own).
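The 3-byte versus 4-byte point can be illustrated with a quick sketch of pixel packing (the specific byte order is an assumption for illustration only):

```python
import struct

# 24-bit color: three bytes per pixel (R, G, B). Consecutive pixels do not
# start on 4-byte boundaries, which is awkward for hardware.
rgb_pixel = struct.pack("BBB", 255, 128, 0)

# 32-bit color: the same three color bytes plus one auxiliary byte, so
# every pixel occupies exactly one 4-byte word.
rgbx_pixel = struct.pack("BBBB", 255, 128, 0, 0)

print(len(rgb_pixel), len(rgbx_pixel))  # 3 4
```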

The banding of color gradients is going to be a little more conspicuous on extended-gamut monitors if we use the same 24-bit color format.

The only solution is to increase the color precision to 30 bits so that each color component is represented by 10 bits. This would increase the total number of colors, reduce the size of each color dot, and solve the gradient-related problems.
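The arithmetic behind the 30-bit proposal:

```python
# 8 bits per channel: 256 levels per component, ~16.7 million colors total.
colors_24bit = (2 ** 8) ** 3     # 16,777,216
# 10 bits per channel: 1024 levels per component, ~1.07 billion colors.
colors_30bit = (2 ** 10) ** 3    # 1,073,741,824
# Each gradient step becomes four times smaller:
step_ratio = (2 ** 10) // (2 ** 8)   # 4
print(colors_24bit, colors_30bit, step_ratio)
```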

Alas, even though graphics cards have long supported the transfer of 30-bit color via the DVI interface (ATI’s cards have offered it since the X1000 series, for example), this feature is not yet widespread. Only a few monitors, such as the expensive NEC SpectraView Reference 2180WG LED, support a 30-bit interface, and there is not much support on the software side, either.

Although the lack of colors on extended-gamut monitors is not a serious problem, especially for home users, it is a problem anyway. After all, we are talking about professional monitors that can be used for prepress and onscreen proofing, i.e. when even minor defects in color reproduction can have serious consequences.

Next comes a more serious problem. When working with color, the software, the graphics card and the monitor all operate not with physical measurement units but with formal numbers, from 0 to 255 for each of the basic colors. For example, {0; 255; 0} is not green, it is just a set of numbers. It becomes green only if we assume that this set corresponds to the monitor lighting up a green subpixel.

So, the problem is that a green subpixel has a different color on an ordinary monitor and on an extended-gamut monitor. It is greener on the latter, that is, purer and more saturated. If you put two such monitors next to each other and display the color {0; 255; 0} on both, you will see a pure green on the extended-gamut monitor and a green with a noticeable yellowish hue on the ordinary monitor.

The transformation of a formal value (a number) into a physical value (a specific color perceived by the eye) is performed by the monitor’s LCD matrix. But the matrices differ from monitor to monitor, whereas software is mostly oriented toward one and the same standard called sRGB.

As a result, monitors with an extended color gamut – extended relative to the standard sRGB gamut – will distort colors when displaying sRGB-oriented pictures prepared in sRGB-oriented software that knows nothing about non-sRGB monitors. The monitor just stretches the sRGB-oriented picture out to fit its own gamut. Not only the pure colors but also the halftones will shift. The only exceptions are white and gray, which will look correct on any monitor unless the monitor is set up badly.
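The shift can be quantified from published chromaticity data. A sketch, assuming the extended-gamut monitor’s green primary matches Adobe RGB (1998) – just a stand-in for any wide-gamut panel:

```python
def xy_chromaticity(X, Y, Z):
    """Project a CIE XYZ color down to its xy chromaticity coordinates."""
    total = X + Y + Z
    return (round(X / total, 2), round(Y / total, 2))

# Green-primary columns of the published RGB-to-XYZ matrices (D65 white):
srgb_green_xyz = (0.3576, 0.7152, 0.1192)   # sRGB green primary
wide_green_xyz = (0.1856, 0.6273, 0.0707)   # Adobe RGB (1998) green primary

# The same code {0; 255; 0} lights only the green subpixel on both
# monitors, yet the displayed chromaticities differ:
print(xy_chromaticity(*srgb_green_xyz))   # (0.3, 0.6)
print(xy_chromaticity(*wide_green_xyz))   # (0.21, 0.71)
```

The wide-gamut green sits noticeably further from the yellow region of the diagram, which is exactly the yellowish-versus-pure-green difference described above.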

The most typical extended-gamut monitor uses backlight lamps with improved phosphors. Such monitors differ from ordinary sRGB monitors in having a more saturated green. Thus, all the halftones will be somewhat shifted on them in the direction indicated by the white arrows in the diagram above (the black triangle is the standard sRGB gamut, and the white triangle is the effective gamut of the extended-gamut monitor that sRGB-oriented images will be stretched out on).

People with some basic knowledge of measurement tools may note that every measurement tool works like a monitor: it presents formal units as physical properties. An ordinary weighing scale shows not the weight of an object but the angle of its needle. But since we know how the angle depends on the weight, we can write numbers denoting kilograms, not degrees, below the needle.

Can this procedure be applied to monitors so that image-processing software could correct the picture for the current monitor’s color gamut? Yes, it can. This procedure is known as hardware calibration.
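A minimal sketch of the math behind such a correction, assuming calibration produced a 3x3 matrix for the monitor (the wide-gamut matrix here is the published Adobe RGB (1998) inverse, standing in for a real measured profile):

```python
def mat_vec(m, v):
    """Multiply a 3x3 matrix (list of rows) by a 3-vector."""
    return [sum(m[i][j] * v[j] for j in range(3)) for i in range(3)]

# Published linear-sRGB -> XYZ matrix (D65 white point):
SRGB_TO_XYZ = [
    [0.4124, 0.3576, 0.1805],
    [0.2126, 0.7152, 0.0722],
    [0.0193, 0.1192, 0.9505],
]
# XYZ -> linear wide-gamut RGB; in real hardware calibration this comes
# from measuring the monitor (here: published Adobe RGB 1998 inverse):
XYZ_TO_WIDE = [
    [ 2.0414, -0.5649, -0.3447],
    [-0.9693,  1.8760,  0.0416],
    [ 0.0134, -0.1184,  1.0154],
]

def srgb_to_wide(rgb_linear):
    """Re-encode a linear sRGB color for the wide-gamut monitor."""
    return mat_vec(XYZ_TO_WIDE, mat_vec(SRGB_TO_XYZ, rgb_linear))

# Pure sRGB green must be mixed with some red and blue on the wider panel:
print([round(c, 3) for c in srgb_to_wide([0.0, 1.0, 0.0])])
# -> [0.285, 1.0, 0.041]
```

Going through a device-independent space (XYZ) and back is the essence of what a calibrated, profile-aware pipeline does: the formal numbers sent to the panel are adjusted so that the physical color matches the one the image intended.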


