When Having 8-bit Colour Was Good

Since around the year 2000, LCD (liquid-crystal display) and later LED (light-emitting diode) monitors gradually replaced CRT (cathode ray tube) technology as the de facto standard. Over the years the size and resolution of monitors have increased significantly. These days, selecting one to buy is really just a matter of choosing the desired screen size and whether it's capable of 1080p or 4K resolution. Colour depth, for instance, no longer needs any consideration.

Of course, during the 1980s and 1990s this wasn't the case. A series of acronyms, frequently quoted, were used to indicate a monitor's capabilities; EGA and VGA were among them. For a number of years now, even low-end or motherboard-integrated video adapters have had no problem displaying millions of colours at a resolution of 1920 x 1080. In the past, however, it was far more important to ensure the video adapter could match the monitor's capabilities, or vice versa.

In this post, I'll be covering the standards used by IBM and compatible PCs, from CGA and MDA in the early 1980s through to XGA in the 1990s. Before closing out, I'll also mention some of the other standards that became available. PCem is used for many of the screenshots due to the lack of physical hardware.
