The monitor is only half of a computer's display system; it must be matched to a display adapter (also commonly referred to as a graphics adapter, video card, or video controller). This lesson discusses the different types of display adapters and design issues that affect quality and performance.
After this lesson, you will be able to:
- Identify the different types of display adapters.
- Understand display memory and how it affects quality and performance.
- Select the right card for a monitor.

Estimated lesson time: 25 minutes
The display adapter has gone through several major evolutions as the nature of PC computing has changed from simple word processing and number crunching, to the graphics-intensive world of Windows and multimedia.
The two "official" video cards for the early 8088-based IBM personal computers (the PC and XT) were matched to the limited capabilities of the early monitors. The Monochrome Display Adapter (MDA) offered a simple text-based monochrome display. This adapter produced an 80-character-wide row of text at a resolution of 720 x 350 pixels. Shortly after that, the Color/Graphics Adapter (CGA) card appeared. It provided up to four "colors" (actually, just different intensities of the monitor's active color: amber, green, or white). In four-color mode, CGA provided a resolution of 320 x 200 pixels. Using just two colors allowed a resolution of 640 x 200 pixels.
With the release of the Enhanced Graphics Adapter (EGA) card, the IBM PC AT became the first PC really able to use color. This adapter was an improved version of CGA, offering a top resolution of 640 x 350 with 16 colors. The EGA also ushered in the era of video conflicts. It was not fully backward-compatible with CGA and MDA, and some programs would display improperly or even lock up the system. The MDA, CGA, and EGA cards all shared the same connection, a 9-pin D-shell male fitting.
The human eye can distinguish 256 shades of gray and about 17 million color variations in a scene, the minimum required to produce true photographic realism on a screen. EGA did not even come close. Its aim was to make color available for pie charts and other forms of business graphics. Although the first graphics programs did arrive to make use of the EGA's graphics capability, serious computer graphics had to wait for better hardware.
A brief digression to explain pixel depth and video memory demands will help you understand what follows. Both the MDA and CGA adapters were equipped with 256 KB of DRAM (dynamic random access memory). The amount of memory on a display card determines the amount of color and resolution that it can image and send to the monitor. As the desire for better graphics and color displays increased, so did the complexity of graphics cards and with them, memory requirements and cost.
Remember that the image on the monitor is a collection of dots called pixels. Each image placed on the screen requires code in the adapter's memory describing how to draw it with those dots and where they fall in the grid. The MDA cards featured a lookup table (LUT) for each character: for every position on the grid, a code number for the symbol at that position was stored in memory, and the card had a chip set that told it how to construct each of those symbols in pixels. The MDA and CGA cards each had 256 KB of memory, just enough to map the screen at their maximum resolution. That's why the CGA card had two different modes: the more colors used, the more memory was required. When it displayed four colors instead of two, the resolution had to drop.
The MDA card was a 1-bit device. In other words, each pixel used 1 bit, valued either 0 or 1 to represent whether a given position on the screen (a pixel) was on or off. To represent colors or shades of gray, a card must use memory to describe color and intensity. This attribute of the display, measured in bits, is known as color depth. Color depth multiplied by resolution (the number of horizontal pixels multiplied by the number of vertical rows on the screen) determines the amount of memory needed on a given display adapter.
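That memory arithmetic can be sketched directly. The function name and example figures below are illustrative, not from the lesson:

```python
def video_memory_bytes(width, height, color_depth_bits):
    """Minimum bytes of display memory needed to hold one full screen:
    resolution (width x height pixels) times color depth in bits,
    divided by 8 to convert bits to bytes."""
    return width * height * color_depth_bits // 8

# A 1-bit MDA-style display at 720 x 350 needs only 31,500 bytes.
print(video_memory_bytes(720, 350, 1))   # 31500

# An 8-bit (256-color) display at 640 x 480 needs 307,200 bytes.
print(video_memory_bytes(640, 480, 8))   # 307200
```

Doubling the color depth at the same resolution doubles the memory requirement, which is exactly the trade-off the CGA card made between its two modes.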
The adapters that followed the EGA cards to market all offered more colors and, very quickly thereafter, higher resolution. That, in turn, required more processing. The MDA, CGA, and EGA cards all relied on the host computer's CPU. Although that was sufficient in the days before widespread use of graphical interfaces and lots of color, with the advent of the graphical user interface (GUI), all that changed.
The new generation of display cards started the practice of including their own display coprocessors on-board. Coprocessors, which have their own memory, are tuned to handle tasks that would usually slow down the PC, and many display cards use bus mastering to reduce the amount of traffic on the system bus and to speed display performance. Video coprocessing is also called "hardware acceleration." This uses one or more techniques to speed up the drawing of the monitor image. For example, one or more screen elements can be described without using calculations that have to determine the placement of every pixel on the screen.
These new graphics chips were designed to do one thing: push pixels to the screen as efficiently as possible. At first, the cards that used them were expensive and often prone to memory conflicts with the host CPU. Their growing popularity led to rapid advances in design. In the mid-1990s, a new graphics card was introduced on the market almost every day, and a new processor almost every ten days.
Today, high-performance graphics adapters are the norm. While there is no longer a mad rush to market, the graphics coprocessor is a key element of fast Windows performance. Next, we return to our review of standards and see how the industry progressed to today's world of high-speed, full-color computing.
Graphics artists, engineering designers, and users who work with photorealistic images need more than a coarse, 16-color display. To tap into this market, which was using $40,000 workstations, PC vendors needed more powerful display systems. IBM offered a short-lived and very complicated engineering display adapter, the Professional Graphics Adapter (PGA). It required three ISA (Industry Standard Architecture) slots, and provided limited three-dimensional manipulation and 60 frames-per-second animation of a series of images. It was also very expensive and a dismal failure in the marketplace.
The reason was the advent of the Video Graphics Array (VGA) standard. All the preceding cards were digital devices; the VGA produced an analog signal. That required new cards, new monitors, and a 15-pin female connector. It allowed developers to produce cards that provided the user with up to 262,144 colors and resolutions up to 640 x 480.
The VGA card quickly became commonplace for a PC display system, and the race was on to produce cards with more colors, more resolution, and additional features. VESA (Video Electronics Standards Association) agreed on a standard list of display modes that extended VGA into the high-resolution, photographic-quality color world we know today. The standard is known as SVGA (Super VGA). SVGA sets specifications for resolution, refresh rates, and color depth for compatible adapters. On Pentium and later PCs, an SVGA adapter is the standard display adapter. The minimum for SVGA compatibility is 640 x 480 resolution with 256 colors, and most modern adapters go far beyond that.
The basic SVGA specification calls for 256 shades of gray, the number required for true photographic reproduction of monochrome images; it's the number of shades the human eye expects in a grayscale photo. Color requires the same number of shades for each color in the image to achieve the same level of visual realism. Producing 256 shades requires an 8-bit memory address system inside the card: 2^8 = 256. In the early days of SVGA, vendors worked to increase color quality without significantly increasing cost.
In color mode, an 8-bit card can't display all the colors in a full-color picture, so a LUT is used to find the closest available match to any hue that can't be represented directly. Although this method isn't ideal, it was for several years the state of the art on desktop PCs. Then, early in the age of the 486 processor, came the 16-bit SVGA card, which allowed approximately 64,000 colors. More bits mean more memory, more processing power, a bigger LUT, and more money. These cards were tuned for bigger monitors, 15 to 17 inches at 800 x 600 or 1024 x 768 resolution. The new systems were too expensive for average users, but graphics professionals and power users generated a large enough market to fuel development.
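The closest-match idea can be sketched in a few lines. The tiny palette and the squared-distance metric here are illustrative assumptions; real adapters performed this lookup in hardware:

```python
def nearest_palette_index(color, palette):
    """Return the index of the palette entry closest to an RGB color,
    using squared Euclidean distance in RGB space."""
    def dist(entry):
        return sum((c - p) ** 2 for c, p in zip(color, entry))
    return min(range(len(palette)), key=lambda i: dist(palette[i]))

# A toy 4-entry "palette": black, red, green, white.
palette = [(0, 0, 0), (255, 0, 0), (0, 255, 0), (255, 255, 255)]

# A reddish hue that isn't in the palette maps to the red entry.
print(nearest_palette_index((250, 10, 10), palette))  # 1
```

An 8-bit card's real palette holds 256 such entries, so most hues in a photograph land on a "close enough" substitute rather than an exact match.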
During the early days of VGA and SVGA, three other graphics-card standards were introduced by IBM for the PS/2. Although they never gained significant market share or full support among adapter developers, they did increase the demand for higher resolution and faster performance. The following list presents the highlights of the evolution of PC graphics standards:
The SVGA adapters were a stepping stone; the growing popularity of Microsoft Windows and scanners pushed the demand for cards that could deliver color of photographic quality. In the early 1990s, several manufacturers introduced add-on cards that could be attached to SVGA cards to deliver 16.7 million colors. Soon after, stand-alone products that offered both SVGA resolution and true-color operation arrived. These adapters, known as true-color or 24-bit displays, come with coprocessors and lots of memory; in true-color mode they have 256 shades (8 bits) available for each of the three color channels: red, green, and blue. By mixing them, the system can display 2^24 colors. Some monitors use traditional 15-pin cables, and some use BNC bayonet cables, with a separate cable for each RGB color and one each for vertical and horizontal synchronization. The latter are found on many high-performance systems.
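The arithmetic behind the "16.7 million" figure is quick to verify:

```python
# 8 bits per channel gives 256 shades of each primary color.
shades_per_channel = 2 ** 8

# Three independent channels (red, green, blue) multiply together.
total_colors = shades_per_channel ** 3

print(shades_per_channel)  # 256
print(total_colors)        # 16777216, i.e. about 16.7 million
```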
True-color cards originally sold for $3,000 or so, but within two years were under $800; now they are available for $150 or less. To add value, the better cards now have TV output ports that send a National Television Standards Committee (NTSC) signal that can be used to record images from the monitor onto a VCR or TV set. Multimedia cards are equipped with a TV tuner, letting the owner view TV programs on the monitor, or watch DVD (digital video disc) movies on a PC. One reason for the dramatic lowering of prices and added features stems from the mass production of the coprocessors, which reduced their cost to the manufacturer; another factor is the decreasing cost of video memory.
As mentioned earlier, the amount of memory on a display adapter is a major factor in determining the screen resolution and color depth the card can manage. Just as with system RAM, the video memory must operate at a speed that keeps up with the processor and the demands of the system clock. If the display adapter is too slow at updating the image on the monitor, the user is left waiting or is presented with jerky mouse movements and keystrokes that appear in delayed bursts rather than as typed.
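To see why memory speed matters, consider the raw data rate involved in repainting a true-color screen. The refresh rate chosen below is an illustrative back-of-the-envelope assumption, not a figure from the lesson:

```python
width, height = 1024, 768
bytes_per_pixel = 3     # 24-bit true color = 3 bytes per pixel
refresh_hz = 75         # an assumed, typical refresh rate for this resolution

# Every pixel must be read out of video memory on every refresh pass.
bytes_per_second = width * height * bytes_per_pixel * refresh_hz
print(bytes_per_second)  # 176947200, roughly 169 MB per second
```

Sustaining a rate like that, while simultaneously accepting updates from the CPU, is exactly the workload that drove vendors toward faster video memory designs.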
Early video cards used fast page-mode (FPM) DRAM, a series of chips that were basically the same as the RAM used on the early PC's motherboard. This memory form was fine for MDA and CGA cards, and even the 8514/A, but with the higher resolution, increasing pixel depth, and faster refresh rates of VGA displays and beyond, vendors sought improved memory models to get the most performance out of their video coprocessors.
Enter dual-ported memory in the form of VRAM (Video RAM), which can read and write through its two I/O ports at the same time. This allows the adapter to send data to the monitor while the processor updates video memory: fast, but very expensive. VRAM showed up in the best cards, but vendors wanted a low-cost option as well. Some vendors simply used FPM DRAM, leaving the user to discover that, at high resolution, the display was too slow for efficient operation. These cards did sell well in the low-end market, as they allowed the budget-minded user to operate in low-color modes for most tasks, switching to higher color depth only for projects that required high-color or true-color mode. Users who regularly worked in those modes quickly found themselves considering an upgrade.
An alternative is EDO (Extended Data Out) DRAM, which can begin reading a new set of instructions before the preceding set has been completed. This is a common form of system DRAM that boosts performance by about 15 percent over conventional DRAM.
WRAM (Window Random Access Memory, unrelated to the Microsoft operating system) is a high-speed variant of VRAM that costs less to produce and boosts performance by about 20 percent beyond regular VRAM. VRAM and WRAM have become the standard memory types for high-end display adapters.
The mid-range display market makes use of SGRAM (Synchronous Graphics RAM). As the name implies, it is tuned to the graphics-card market, offering faster transfers than DRAM, but not as fast as VRAM and WRAM.
Multibank DRAM (MDRAM) is the final stop on our tour of memory acronyms. It uses interleaving (dividing video memory into 32-KB banks that can be accessed concurrently) to allow faster memory input/output without expensive dual-porting. It is also a more efficient type of chip, practical to produce in sizes smaller than a full megabyte, so a vendor can save money by buying just the amount needed to actually draw the screen. At a true-color resolution of 1024 x 768, which requires 2.25 MB, this saves about 1.75 MB per card compared with rounding up to a 4-MB design.
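The 1.75-MB savings figure can be reconstructed from the numbers above. The 32-KB bank size comes from the lesson; the assumption that a conventional card must round up to the next power-of-two size (here 4 MB) is mine:

```python
KB, MB = 1024, 1024 * 1024

# True color at 1024 x 768: 3 bytes per pixel.
needed = 1024 * 768 * 3            # 2,359,296 bytes = 2.25 MB

# Conventional card: memory ships in large power-of-two sizes, so round up.
conventional = 4 * MB

# MDRAM: memory is assembled from 32-KB banks, so only partial banks are wasted.
banks = -(-needed // (32 * KB))    # ceiling division: 72 banks
mdram = banks * 32 * KB            # exactly 2.25 MB in this case

print((conventional - mdram) / MB)  # 1.75
```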
The table below lists the standard memory requirements for the most common resolutions and pixel depths used today. As pointed out in the previous paragraphs, keep in mind that the minimum amount of memory for MDRAM is usually less than for other types of RAM. Some graphics cards offer additional memory, and even incorporate different types of RAM on the same card. In such cases, some of the memory might be used for features other than merely imaging the picture to be sent to the CRT in pixels.
Minimum Memory Requirements for Common Display Resolutions

| Screen Resolution | 8-bit (256 colors) | 16-bit (65,536 colors) | 24-bit (16.7 million colors) |
|---|---|---|---|
| 640 x 480 | 512 KB | 1 MB | 1 MB |
| 800 x 600 | 512 KB | 1 MB | 2 MB |
| 1024 x 768 | 1 MB | 2 MB | 4 MB |
| 1280 x 1024 | 2 MB | 4 MB | 4 MB |
| 1600 x 1200 | 2 MB | 4 MB | 6 MB |
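The values in the table above follow from the same resolution-times-depth arithmetic, rounded up to a standard card size. The list of standard sizes below is an illustrative assumption, not part of any specification:

```python
STANDARD_MB = [0.5, 1, 2, 4, 6, 8]   # assumed common card memory sizes, in MB

def minimum_card_memory(width, height, depth_bits):
    """Raw framebuffer requirement, rounded up to the next standard card size (MB)."""
    needed_mb = width * height * depth_bits / 8 / (1024 * 1024)
    return next(size for size in STANDARD_MB if size >= needed_mb)

print(minimum_card_memory(800, 600, 24))    # 2, matching the table's 2 MB
print(minimum_card_memory(1600, 1200, 24))  # 6, matching the table's 6 MB
```

Note that 24-bit color at 1024 x 768 needs only 2.25 MB, but because cards did not ship with a 3-MB option, the table lists 4 MB; this is the rounding waste that MDRAM's small banks were designed to avoid.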
Text-based adapters under MS-DOS don't need software drivers to interface between the operating system and the image on the screen. Windows, OS/2, and other graphics-rich environments do need drivers. In addition, controls are needed to adjust the refresh rate, resolution, and any special features the card offers. These needs are handled by the use of display drivers, a software layer that marries the card and monitor to the operating environment.
When installing a new card or operating system for a client, be sure to check the manufacturer's Web site for the latest display drivers. Doing so reduces the likelihood of problems with the new addition, and most new cards also incorporate setup routines that make quick work of getting a new display running.
The following points summarize the main elements of this lesson: