The performance of a video card is determined by several characteristics, including:
Maximum screen resolution
Bit depth (or color depth)
Amount of memory
Speed of the processor
Bandwidth of the connection between the controller and the computer's chipset
The resolution of a screen is based on the number of pixels it takes to fill the entire screen. When you increase the resolution, images on the screen become smaller because Windows defines their dimensions in pixels. On the other hand, if you use the same resolution on a bigger monitor screen, each pixel occupies more space, so the windows, icons, text, and pictures will all be bigger. Because the graphics controller doesn't know what size monitor is connected to it, a controller can support several different resolutions.
The same resolution is easier to read on a large screen. The text and pictures that are tiny on a 15-inch monitor at 1024 × 768 pixels will look fine on a 19-inch screen. In Windows, you can compensate for this difference by changing the DPI (dots per inch) setting on the General tab of the Advanced Display Properties window (Control Panel > Display > Settings > Advanced).
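To see why the same resolution reads differently on screens of different sizes, you can work out the pixel density yourself: divide the screen's diagonal measured in pixels by its diagonal measured in inches. A minimal Python sketch (the function name is ours, chosen for illustration):

```python
import math

def pixels_per_inch(width_px, height_px, diagonal_inches):
    """Pixel density: diagonal length in pixels divided by diagonal in inches."""
    diagonal_px = math.hypot(width_px, height_px)
    return diagonal_px / diagonal_inches

# 1024 x 768 on a 15-inch monitor versus a 19-inch monitor
print(round(pixels_per_inch(1024, 768, 15)))  # 85 pixels per inch (tiny text)
print(round(pixels_per_inch(1024, 768, 19)))  # 67 pixels per inch (easier to read)
```

The drop from roughly 85 to 67 pixels per inch is what makes the same text comfortably larger on the 19-inch screen.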
The best screen resolution for your monitor is a subjective choice, but Table 10.1 shows some commonly used settings. If you don't like the way the recommended resolution looks, try moving up or down to the next option that maintains the correct aspect ratio.
800 × 600
1024 × 768
1280 × 1024
1600 × 1200
Most flat-panel displays (including laptop screens) perform best at one particular screen resolution, known as the native resolution. The manual supplied with the monitor or laptop should tell you the best setting for your screen.
In addition to the size of the screen, you should also consider the screen's aspect ratio when you set the resolution. The aspect ratio is the proportion of width to height, reduced to the smallest possible numbers. Because most CRT monitors used picture tube designs that were originally made for television, their aspect ratios are the same as traditional broadcast TV: four units wide and three units high, or 4:3. The 640 × 480, 800 × 600, 1024 × 768, 1400 × 1050, and 1600 × 1200 resolutions all have 4:3 aspect ratios.
When flat-panel monitors came along, their designers usually retained the same 4:3 aspect ratio. However, a few monitors have a slightly narrower 5:4 aspect ratio, so Windows often supports at least two 5:4 resolutions (1280 × 1024 and 1600 × 1280). Still other monitors (especially in laptop computers) use the wide-screen aspect ratio used by high-definition TV and DVD movies. If your monitor or laptop has a wide screen, your graphics controller should also offer wide-screen resolutions to match.
Bit depth, also called color depth, is the number of bits used to specify the color of each pixel. The color depth of a black-and-white image is therefore 1 bit, and a 256-color image has a color depth of 8 bits. The maximum color depth on many computer graphics controllers is 32 bits (24 bits of color plus 8 bits of transparency data), equal to more than 16 million colors.
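The relationship between bit depth and the number of available colors is simply a power of two, which this short Python sketch illustrates:

```python
def color_count(bit_depth):
    """Number of distinct values that a given number of bits can represent."""
    return 2 ** bit_depth

print(color_count(1))   # 2 -- black and white
print(color_count(8))   # 256 colors
print(color_count(24))  # 16777216 -- the "more than 16 million colors"
```

Note that 32-bit display modes generally devote 24 of those bits to color and the remaining 8 to transparency, which is why 32-bit color still yields about 16.7 million distinct colors.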
The screen refresh rate is the number of times per second the graphics controller renews the image on the screen. On a CRT, the refresh rate is the speed at which the electron guns scan across the inside of the screen; on a flat-panel monitor, it's the rate at which the controller renews each pixel.
Every time the controller renews the image on a CRT screen, the image goes dark for a tiny fraction of a second before the new image appears. Your brain fills in the image during those dark moments, but if the refresh rate on a CRT monitor is too slow, you might notice that the screen appears to flicker rapidly, especially if you're working under fluorescent lights. To eliminate this problem, increase the refresh rate to 66 Hertz or more (explained later in this chapter).
Flicker is not a problem with LCD displays, so the best refresh rate is the native rate for that monitor. Most LCD monitors have a native refresh rate of 60 Hz, but some newer models might be faster. Consult your monitor's manual or specifications sheet to find the right setting for your monitor.
Because screen resolution, bit depth, and refresh rate all contribute to the number of bits that the controller must renew every second, the maximum values might be limited by the amount of memory on your video card. That's why the maximum refresh rate on a card with a limited amount of RAM drops as you increase the resolution, and why the card might automatically reduce the color depth if you choose a high resolution and refresh rate.
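You can estimate that load yourself: multiply the resolution by the color depth and the refresh rate to get the raw number of bits the controller must deliver every second. A rough sketch (the function name is illustrative):

```python
def refresh_bits_per_second(width, height, bit_depth, refresh_hz):
    """Raw bits the controller must renew each second for a given display mode."""
    return width * height * bit_depth * refresh_hz

# 1280 x 1024 at 32-bit color and a 60 Hz refresh rate
bits = refresh_bits_per_second(1280, 1024, 32, 60)
print(bits / 8 / 2**20)  # 300.0 -- megabytes of pixel data per second
```

Raising any one of the three factors raises the total, which is why a card near its limits must trade one setting off against the others.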
In order to estimate the amount of memory your graphics controller needs, you must start by understanding how much memory it takes to create an on-screen image. For two-dimensional images, the minimum is equal to the resolution of the image, multiplied by the color depth.
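That minimum works out like this in a short Python sketch (divide by 8 to convert bits to bytes):

```python
def frame_buffer_bytes(width, height, bit_depth):
    """Minimum memory for one 2-D image: resolution multiplied by color depth."""
    return width * height * bit_depth // 8  # bits -> bytes

# 1024 x 768 at 32-bit color
print(frame_buffer_bytes(1024, 768, 32))          # 3145728 bytes
print(frame_buffer_bytes(1024, 768, 32) / 2**20)  # 3.0 MB
```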
To learn how much memory the graphics controller inside your computer has, follow these steps:
From the Windows desktop, use your mouse to move the cursor away from any icon.
Right-click the mouse.
When the pop-up menu appears, choose Properties at the bottom of the menu. The Display Properties window appears.
Click the Settings tab. The Settings screen shown in Figure 10.8 appears.
Figure 10.8: The Settings screen in the Display Properties window controls graphics options.
Click the Advanced button near the bottom right corner of the screen. A new properties screen for your graphics adapter appears.
Click the Adapter tab. The screen shown in Figure 10.9 appears. The Adapter Information box in the middle of the window includes the make and model of the chipset and the amount of RAM on the graphics card.
Figure 10.9: The Adapter tab in the video card's Properties window identifies the chipset and shows the amount of RAM memory.
A graphics controller uses onboard RAM to hold several kinds of data:
The frame buffer: This is the section of memory that sends the image to the monitor. It must be big enough to hold a complete image.
The back buffer: The controller assembles the next image in the back buffer while the frame buffer feeds a completed image to the monitor. So the size of the back buffer must be equal to the size of the frame buffer.
Therefore, a two-dimensional display requires enough video memory to hold at least two complete bitmap images. At 16-bit color, a 1024 × 768 image needs at least 3MB of video memory, 1280 × 1024 needs at least 5MB, and 1600 × 1200 requires a minimum of 8MB. Video cards with more RAM than those minimums might produce somewhat better 2-D performance, but you can expect an old card with just 8 or 16MB of RAM to produce an adequate 2-D picture, although it might not support 32-bit color quality. But remember that Windows Vista uses 3-D images, so you might have to replace an underpowered video card when you upgrade your operating system.
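The minimums quoted above work out if you assume 16-bit color and two buffers (frame buffer plus back buffer); that assumption is ours, inferred from the arithmetic rather than stated outright. A sketch:

```python
def min_2d_video_memory_mb(width, height, bit_depth=16, buffers=2):
    """Frame buffer plus back buffer, in megabytes, at an assumed 16-bit depth."""
    return width * height * (bit_depth // 8) * buffers / 2**20

print(min_2d_video_memory_mb(1024, 768))   # 3.0 MB
print(min_2d_video_memory_mb(1280, 1024))  # 5.0 MB
print(min_2d_video_memory_mb(1600, 1200))  # ~7.3 MB, sold as an 8MB card
```

Doubling the color depth to 32 bits doubles each of these figures, which is why an older 8MB card can run high resolutions only at reduced color quality.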
Three-dimensional images require a lot more RAM because they have to support these additional elements:
The z-buffer: In three-dimensional images, the z-buffer holds the data that places one item in front of another and creates the illusion of perspective. The depth of the z-buffer (most often 24 or 32 bits) defines the quality of the 3-D image. The size of the z-buffer is equal to the image size multiplied by the buffer depth.
Polygons: Polygons are the basic shapes from which 3-D objects are built. A scene drawn with more polygons produces a more realistic picture, but rendering more polygons demands processing speed rather than additional memory.
Textures: Textures are the shading, reflective patterns, and other characteristics of the surfaces of 3-D objects that give them a realistic appearance. A complex scene might include multiple textures. Complex textures can use large amounts of memory to support extensive amounts of detail.
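Of these three elements, only the z-buffer has a size you can compute directly from the rule given above (image size multiplied by buffer depth). A sketch:

```python
def z_buffer_bytes(width, height, z_depth_bits):
    """Z-buffer size: one depth value per pixel, at the given bit depth."""
    return width * height * z_depth_bits // 8  # bits -> bytes

# 1024 x 768 with a 24-bit z-buffer
print(z_buffer_bytes(1024, 768, 24) / 2**20)  # 2.25 MB on top of the frame buffers
```

Texture memory, by contrast, depends entirely on how many textures a scene uses and how detailed they are, so it can't be reduced to a single formula.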
In today's marketplace, even the most inexpensive low-end graphics controllers have 32MB, 64MB, or even 128MB of RAM. That's more than enough memory for any 2-D application. Three dimensions are a completely different matter: a 3-D image takes many times more RAM to hold all that additional information about each scene. That's why high-end graphics cards often have 256MB or even 512MB of RAM; 128MB is the bare minimum needed for games, Windows Vista, and other 3-D graphics.
The second important specification that describes the quality of a video controller is the clock speed on the graphics processor. Faster is better, especially for 3-D images, because a faster processor can render complete images more quickly. 250–300 MHz is plenty for non-gaming users; high-end 3-D video cards can reach 500 MHz or more. However, there's more to quality than just raw speed; some very fast video cards can produce images that are worse than the ones from other, slower cards.
In games and other 3-D programs, and for playing streaming video, the frame rate, or number of complete frames the processor can produce per second, is another indicator of the processor's speed. The frame rate is not the same as the refresh rate, because it measures the number of completely different images the controller can produce each second, rather than simply the number of times that the controller renews the same image.
The fastest graphics processor chips can consume a lot of power. In order to reduce the power drain on the PCI bus or the AGP socket, many high-end video cards have a dedicated power connector that plugs into a cable directly from the computer's power supply.
Bandwidth is the rate at which the graphics processor and the controller card's on-board memory exchange data. The important bandwidth specification is the memory interface, which describes the number of bits that can move simultaneously between the processor and the card's memory. Low-end cards use a 32-bit or 64-bit memory interface, but for high-performance operation, a 128-bit or even 256-bit interface allows the graphics controller to produce much better images.
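As a rough illustration of why interface width matters, peak memory bandwidth is the interface width multiplied by the memory clock speed (doubled for DDR memory, which transfers data twice per clock cycle). The 400 MHz clock figure below is an illustrative assumption, not a quoted specification:

```python
def peak_bandwidth_gbps(interface_bits, mem_clock_mhz, transfers_per_clock=2):
    """Peak memory bandwidth in GB/s; transfers_per_clock=2 models DDR memory."""
    bytes_per_second = (interface_bits / 8) * mem_clock_mhz * 1e6 * transfers_per_clock
    return bytes_per_second / 1e9

# Same assumed 400 MHz DDR memory, different interface widths
print(peak_bandwidth_gbps(64, 400))   # 6.4 GB/s
print(peak_bandwidth_gbps(256, 400))  # 25.6 GB/s
```

At the same memory clock, the 256-bit interface moves four times the data of the 64-bit one, which is the advantage high-performance cards are paying for.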