Understanding Computer Monitors: Resolution, Color Depth, and More
Published: Feb 26, 2008
You can spend all day using the computer and not once think about the part you're actually looking at: the monitor. Even inexpensive monitors tend to perform reliably for a long time--I've been using mine for four years without trouble and expect to get several more out of it--so it's easy to ignore them even while staring at them for hours on end. But as you might guess, there's a lot to the underlying technology, and especially when you're buying a new monitor there are some things that will be helpful to know.
Types of displays
To begin, "monitor," "display," and "screen" are essentially interchangeable terms. There are two major display technologies on the market: CRT and LCD. CRT stands for Cathode Ray Tube, and the technology--which consists of an electron gun fired at a phosphor-coated glass screen--has gone through many, many enhancements since its invention more than a century ago. You can recognize a CRT by its depth (many CRTs are more than a foot deep) and considerable weight (owing to its glass construction). CRTs have several attractive qualities including excellent brightness, contrast, and response time (the time it takes for one image to be completely replaced by another). However, their size and weight have recently contributed to the rising popularity of LCD monitors.
An LCD--short for Liquid Crystal Display--monitor uses a modern version of the same technology that prints numbers on the face of a digital watch. Liquid crystal molecules are sandwiched between layers of glass, and when electricity is applied to them their opacity--how much light is allowed through--changes. A backlight behind the LCD illuminates the screen. The obvious advantage of LCDs is that they are very thin, allowing for large displays that are only a few inches (or even a fraction of an inch) thick. They are consequently much lighter than their CRT counterparts, and LCDs also consume less electricity. Only in the past decade have LCD screens become cheap enough to manufacture that they have begun to displace CRT monitors. LCDs are commonly called "flat screen" monitors, but not all flat screens are LCDs. Additionally, essentially all laptop screens (as well as those on most handheld devices) are LCDs.
CRT and LCD aren't the only display technologies out there: Plasma screens are common among very large displays, and OLED (Organic Light-Emitting Diode) displays may also challenge LCDs in the near future. But for the moment, almost every computer monitor sold is either an LCD or CRT.
Types of connectors
There are three major types of connectors--that is, the part that you use to physically attach a computer to a monitor--on the market today. Covering all of the technical differences between them would take an entire article; instead, I'll give you the basics and teach you how to identify them.
VGA (Video Graphics Array) is an analog connector, and the oldest monitor connector that's still common in modern PCs. The connector has three rows of five pins each and, for easy identification, is often (but not always) colored blue.
DVI (Digital Visual Interface) is a newer connector, often (but again, not always) colored white. It's similar in size and shape to VGA, but slightly longer. There are actually five distinct species of DVI connector, including DVI-A, which carries only analog signals, DVI-D, which carries only digital, and DVI-I (for Integrated), which carries both. DVI connectors have up to 29 pins, including a wide, flat pin that's easy to identify. Many DVI monitors are VGA-compatible and can be connected to VGA devices with a simple adapter, though doing so forfeits the benefits of a digital connection, such as reduced susceptibility to electrical interference and signal loss.
Finally, HDMI (High-Definition Multimedia Interface) is gaining popularity in high-end displays. It's a wide, flat connector with 19 pins that can carry both digital audio and video. For video, HDMI is backwards compatible with DVI-D using an adapter. However, it has a caveat: HDCP, or High-bandwidth Digital Content Protection, which is a proprietary copy protection scheme built into many HDMI devices. The intent of HDCP is to prevent HDMI displays from showing "unauthorized" content, but in practice, it makes non-HDCP devices incompatible with some HDCP devices. For example, if you have a Blu-ray player with HDCP but a monitor without it, the player may be able to play a Blu-ray movie but the monitor might not be able to show it. One solution is to make sure all of your HDMI devices support HDCP. Another is to avoid HDMI entirely and stick with DVI until a more consumer-friendly standard comes along.
Now that we've established the types of monitors and their connectors, let's talk about their capabilities. There are two different factors here: the monitor itself and the video card. The video card, more properly called a display adapter, is the part of the computer that sends signals to the monitor. In many computers the display adapter isn't actually a separate card; rather, it's integrated into the motherboard. The video card and monitor each have their own sets of limitations regarding resolution and color depth. Let me explain.
Resolution refers to the number of pixels on your screen, described as width (in pixels) by height. For example, my laptop screen has a resolution of 1024x768--that's 1,024 pixels wide and 768 pixels tall. One potentially confusing thing about resolution is that it's completely independent of the physical size of your monitor. Some very large monitors have relatively low resolutions, whereas some small screens have very high resolutions. Generally speaking, monitors capable of high resolutions are more costly to manufacture and as a result have higher price tags.
One thing that even novices notice about resolution is that at higher resolutions many things--desktop icons and the Start menu, for example--appear smaller. The reason is this: These elements have a fixed size in pixels (the default for desktop icons is 32 pixels wide and tall, for example). Higher resolutions put more pixels into the same number of inches, making each pixel smaller. Regarding resolution there's an important difference between CRT and LCD monitors: a CRT monitor is capable of multiple different resolutions, whereas an LCD screen has only one "native" resolution. Some people make the mistake of choosing a resolution in Windows other than the correct resolution for their LCD, sometimes choosing a lower resolution to make things look bigger and sometimes a higher one to "fit" more on the screen. What happens in this case is that the monitor, having a fixed number of pixels to work with, has to either squeeze several pixels' colors onto a single pixel or stretch a single pixel's color across several. The end result is that the image becomes blurry or pixelated and detail is lost. I highly recommend against using anything other than the native resolution on an LCD screen--when I do, it gives me a headache.
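The "higher resolution makes things smaller" effect is easy to quantify. Here's a small sketch (using a hypothetical 15-inch screen as the example) that estimates a screen's pixel density from its resolution and diagonal size, then computes how wide a 32-pixel icon would physically appear:

```python
def pixels_per_inch(width_px, height_px, diagonal_in):
    """Approximate pixel density: diagonal in pixels divided by diagonal in inches."""
    diagonal_px = (width_px ** 2 + height_px ** 2) ** 0.5
    return diagonal_px / diagonal_in

# The same hypothetical 15-inch screen at two different resolutions:
for w, h in [(1024, 768), (1400, 1050)]:
    ppi = pixels_per_inch(w, h, 15)
    icon_inches = 32 / ppi  # physical width of a 32-pixel icon
    print(f"{w}x{h}: about {ppi:.0f} pixels per inch; a 32-px icon is {icon_inches:.2f} in wide")
```

At 1024x768 the icon comes out to roughly 0.38 inches wide; at 1400x1050 on the same glass, about 0.27 inches--the same 32 pixels, just packed into less space.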
Aspect ratio and "wide" screens
A concept closely related to resolution is aspect ratio--that's the ratio of a monitor's width to its height. Until recent years, almost all consumer-grade monitors had the same aspect ratio: 4:3, meaning for every four pixels of width there are three pixels of height. You can test this by dividing the width by the height--for example, 1,024 divided by 768 is 1.333... (that is, about 1.33:1), which is equal to 4 divided by 3, or 4:3. However, "widescreen" monitors have rapidly been gaining popularity. Widescreen can mean a lot of different things, but in computer monitors it usually refers to an aspect ratio of 16:10 (also referred to as 8:5 or 1.6:1, which are all equal). A typical widescreen resolution is 1280x800.
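The divide-width-by-height test above can be turned into a tiny function. This sketch reduces any resolution to its simplest width:height ratio by dividing out the greatest common divisor:

```python
from math import gcd

def aspect_ratio(width, height):
    """Reduce a resolution to its simplest width:height ratio."""
    d = gcd(width, height)
    return (width // d, height // d)

print(aspect_ratio(1024, 768))   # (4, 3)  -- classic 4:3
print(aspect_ratio(1280, 800))   # (8, 5)  -- i.e. 16:10 widescreen
print(aspect_ratio(1280, 1024))  # (5, 4)  -- the oddball SXGA ratio
```

Note that this gives the ratio in lowest terms, which is why 1280x800 comes out as 8:5 rather than the equivalent (and more commonly quoted) 16:10.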
This is a lot of numbers, but in researching monitors you'll just as often come across acronyms--every common resolution has a name. A resolution of 640x480 shares its name with the connector--VGA--but such a low resolution is rarely found in monitors (you might see it in handheld devices, however). Names for higher resolutions stem from VGA: SVGA, for Super VGA, refers to 800x600. My laptop screen--1024x768--is XGA (eXtended Graphics Array). My desktop LCD is kind of an oddball--it's SXGA, or 1280x1024, giving it an atypical aspect ratio of 5:4. Widescreen resolutions usually have a W in front of their names, like WXGA, and have the same height in pixels as their 4:3 cousins, but greater width. Wikipedia has a useful chart showing the most common resolutions, their names, and their aspect ratios.
Color depth is, in essence, the number of colors that can be displayed on the screen at once. It's usually discussed as a number of bits: 8 bits, 16 bits, 32 bits, etc. There's a little math involved in understanding this: The number of bits refers to how many bits (i.e. binary digits) it takes to represent a particular number of colors. 8 bits can represent 2^8 colors (that's two to the eighth power), or 256 colors. Sixteen bits is 2^16, or 65,536, and 24 bits, sometimes called "Truecolor," allows for 2^24 colors--more than 16 million. A 32-bit setting is available on most modern video cards, but in practice does not actually mean 2^32--more than 4 billion--colors. Instead, 32-bit color usually means 24 bits for color data and the remaining 8 bits for other graphics data.
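The math above is just powers of two, which a few lines of code make concrete:

```python
def colors_at_depth(bits):
    """Number of distinct colors representable with the given number of bits."""
    return 2 ** bits

print(colors_at_depth(8))    # 256
print(colors_at_depth(16))   # 65536
print(colors_at_depth(24))   # 16777216 -- "Truecolor"

# A typical "32-bit" setting is 24 bits of color plus 8 bits of other
# graphics data, so the color count is the same as at 24 bits:
print(colors_at_depth(32 - 8))  # 16777216
```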
In general, the higher the color depth you choose, the better your screen will appear when looking at things with lots of color like photos and videos. At color depths of 16 bits and lower your video card will be forced to use a technique called dithering to display some colors--basically, using patterns of two different colors to produce an in-between color. While reasonably effective, dithering is fairly noticeable and, in my opinion, a little ugly. The only advantage to using a lower color depth than the greatest one available to you is that it may use less memory; most modern systems, however, are well equipped for 24- and 32-bit color.
Changing your display settings in Windows
It's easy to change your display settings. In Windows XP, simply right-click on an empty spot on the desktop and choose Properties, then click on the Settings tab. In Windows Vista, right-click on the desktop, choose Personalize, and then click on Display Settings. You can use the slider in the lower left to adjust your resolution--remember, though, that if you have an LCD monitor only the "native" resolution will work without some blurring or pixelation. On the right you can choose your color depth.
If you have more than one monitor connected to your PC you'll see a picture of each, and can configure them individually by clicking on them and adjusting the settings. Additionally, you can move the pictures around by clicking and dragging to better represent how they sit on your desk.
Blogger since 1999, Jordan Running went pro in 2005 and never looked back. Sometimes programmer, occasional photographer, and serial tinkerer, he decided to switch to Linux in 2001 but just hasn't quite gotten around to it yet.