In The Eyes of 1998 (Part 2)

Following on from Part 1, Part 2 looks at graphics and hand-held devices based on what was covered in a 1998 issue of PC Magazine.

3-D Computing

3D graphics as we know it was still very much in its infancy in 1998. 3Dfx Interactive was one company offering such capabilities to the desktop PC. Early that year they released the Voodoo2, which proved very popular. A PCI-based dedicated 3D graphics card with 8 or 12 MB of memory, it was frequently used as a benchmark in performance comparisons. The Nvidia RIVA 128 and ATI 3D Rage Pro cards were in direct competition, capable of both 2D and 3D graphics on a single card. Such cards had a tendency to provide mixed results in 3D use, although they were early adopters of the then-new AGP standard. 3Dfx responded with the Banshee, albeit with reduced texture mapping capability compared to the Voodoo2.


The rear of the retail box for the Voodoo2 by 3Dfx. Creative also had their own Voodoo2-based card in their own packaging.

Intel released their first graphics processing unit, the i740, a thinly veiled attempt at promoting AGP video. Before release it was touted as a serious challenger to the budget brands, an S3 “killer” as it were.

Microsoft released DirectX 6.0, offering several new features such as a new Direct3D framework, texture map compression, and range-based fog.

Download DirectX 6.0 feature article (by Ron Fosner, Microsoft Systems Journal)



Half-Life (1998) required DirectX 6.0 and had octagon-shaped tyres! Image source:

On the processor front, Intel had already released their Pentium MMX range. These were updates to the original Pentium: they operated at lower voltages, doubled the Level 1 cache to 32 KB, and provided the MMX instruction set of 57 new instructions.

The MMX instruction set was aimed at applications that relied heavily on encoding and decoding audio and video, whether in gaming or content creation. Software that took advantage of MMX was slow in coming, though Adobe for instance released what were known as MMX Technology Plug-ins for Premiere 4.2. AMD, with their K6-2 processors, went with an instruction set known as 3DNow!, comprising 24 instructions focussed on vector processing.

Download AMD’s 3DNow! Technology Manual


2001 – The prediction naturally was to expect more advanced graphics capabilities. Based on Moore’s Law, taken here to dictate a doubling of performance every 18 months, 2001 would see performance approximately four times higher than in 1998. This was to be aided by the introduction of DDR DRAM video memory. Memory size would be 8 MB as a minimum, with 16 MB typically seen. Budget-orientated PCs would see an influx of 2D/3D graphics integrated on their motherboards, while mid to high-end PCs would continue using dedicated AGP cards.

Realism in gaming would be improved with anti-aliasing to remove jagged lines, improved texture mapping for surfaces (e.g. a brick wall), and particle system acceleration for effects such as smoke and rain. Still, it was questionable how relevant these enhancements would ever be to everyday productivity applications.

Verdict – Starting with memory size, the predicted amounts were to an extent underestimated. Entry-level systems would see 8 MB Intel graphics, either integrated or as a dedicated card, while a laptop may have had an ATI Rage Mobility 128 integrated. While still a jump from the 2 and 4 MB video cards of 1998, the reputation these gained was that they were best suited to 2D application use. Systems with integrated graphics would also see a chunk of the system’s memory utilised, reducing performance.

The Nvidia TNT2 M64, with 16 or 32 MB, was popular in the OEM market for mid-range systems, though it was starting to show its age against the then-new Nvidia GeForce2 and ATI Radeon, which sported either 32 or 64 MB of memory.

3Dfx had already begun their downward trajectory. The release of the Voodoo3 was considered underwhelming, and by the time the Voodoo4 and Voodoo5 were released in 2000, the company was headed for bankruptcy. In the process, Nvidia acquired what was left of the company.

Intel’s i740 was by and large considered a commercial failure by the second half of 1999; it simply wasn’t competitive enough against its rivals. Later that year the Intel i810 chipset, also designed for low-cost computing, incorporated an integrated graphics processor based upon the i740. 2001 saw the release of Intel Extreme Graphics, initially for the mobile market with Pentium III laptops, which despite the name only offered modest improvements. For mobiles it was either the 830M or the lower-cost 830MG chipset, while desktops saw several variations of the 845 chipset on Pentium 4-based motherboards. Nevertheless, Intel graphics chipsets did not set the world on fire.

DirectX was up to version 8.1, included with the initial release of Windows XP, offering new specifications in relation to pixel shaders.

Battlefield 1942 (2002) required DirectX 8.1. Tyres had also improved, now a dodecagon. Image source:

Released in 1999 with the Pentium III processor, Intel’s SSE (Streaming SIMD Extensions) had been updated to SSE2 in 2001. SSE originally contained 70 instructions, mostly for working with single precision floating point data, such as when processing graphics. SSE2 added a further 144 instructions offering double precision floating point, and performance enhancements over MMX. AMD added the SSE instructions to their Athlon XP processors, while SSE2 wouldn’t arrive until 2003 with the release of their Athlon 64 and Opteron processors.


In 2010 it was announced that 3DNow! would no longer be incorporated into AMD’s processors, apart from two instructions out of the original 24, PREFETCH and PREFETCHW, citing that Intel’s SSE instruction set series had made much of 3DNow! redundant.


Mainstream Graphics

This section of the article in PC Mag wasn’t about the 2D graphics cards still around in 1998, but the use of 3D graphics outside of gaming, such as in data analysis and user interfaces. One example given was real-time analysis of the world’s air traffic: being able to spin around the globe and zoom in on particular areas of interest. Google Earth instantly springs to mind in resemblance, though this was a time when Google had barely moved out of a garage in California.

Interaction with the Windows operating system has historically been in 2D. The use of 3D was restricted to its own window, known as a DirectDraw surface, meaning there was no ability to blend 3D into a 2D application as such. DirectX was seen as part of the jigsaw puzzle to make this happen.

A core Windows component known as GDI (Graphics Device Interface), responsible for getting anything to appear on your screen, was also expected to be redesigned to merge 2D and 3D more seamlessly. At the time GDI utilised RGB (red, green, and blue) channels, and it was suggested an additional channel be added for translucency, with allowances for 3D effects on windows. This is really just going from 24- to 32-bit colour, though the mention of translucent windows gives a hint of the Aero interface introduced with Windows Vista almost ten years later.

A joint venture between Microsoft and Silicon Graphics Inc (SGI), known as Fahrenheit, aimed to merge the OpenGL and Direct3D APIs. A set of developer tools utilising Fahrenheit was expected to appear by mid-1999, to be fully incorporated into DirectX by 2000.

2001 – PC Mag didn’t offer anything specific to expect by the year 2001, apart from updated XML and Dynamic HTML specifications for the purposes of web graphics.

Verdict – It’s fair to say that over the following three years, nothing really changed in the use of 3D.

Microsoft Research, a subsidiary of Microsoft, had however developed what was known as the Task Gallery, a 3D window manager. It ran on a modified version of Windows 2000, and was an experiment in spatial cognition and alternatives to the desktop metaphor. Effectively the notion was that of walking into an art gallery, with each ‘painting’ grouping related tasks together so they could be easily manipulated. It didn’t get past the laboratory in the end.

Microsoft Research video explaining their 3D Task Gallery.


The research paper on the design and usability of the Task Gallery.


Another project that only went so far was Fahrenheit. Announced in 1997 as an important collaboration between Microsoft and SGI, it was well and truly dismissed before 2001 even arrived. Of the two companies, the project’s success mattered more to SGI, which was experiencing dwindling sales of its dedicated workstations. Microsoft on the other hand preferred to focus on releasing DirectX 7.0 with further development of Direct3D, with minimal resources, if any, working on Fahrenheit. Besides, Direct3D was favoured over OpenGL by this stage.

Windows Vista, released in 2006, introduced the Aero interface, requiring a DirectX 9.0 compatible video card. Translucent effects on windows, and a new feature known as Flip 3D for switching between applications, were subtle graphical enhancements, though compared to the 3D Task Gallery they didn’t drastically change how we went about using Windows.

2006 also saw the development of BumpTop, originating from a group of graduates at the University of Toronto. It has since been acquired by Google and become open source. Despite the use of 2D icons, it offers one of the closest experiences to a 3D desktop, and supports finger gestures on touch screens.

Apple’s Mac OS X provided the Aqua user interface, replacing the Platinum look of Mac OS 8 and 9. Depending on the release, various levels of translucency and transparency effects were used, while 3D effects were relegated to the Dock for launching applications. In 2008, before Mac OS X Snow Leopard was released, Apple filed several patents for a “Multi-Dimensional Desktop”. Although the user interface design varied amongst the patents, the use of layers and depth was the common theme to convey a 3D desktop.


One of Apple’s patent designs offering depth to the desktop.

Now that virtual reality is becoming more commonplace on the PC for gaming, it will be interesting to see how this changes the way we use operating systems and productivity applications in the future.

Hand-Held Devices

Hand-held devices, frequently known as PDAs (personal digital assistants) or hand-held PCs, have more or less been around as long as laptops. One such early hand-held, though not the first, was the Poqet PC model PQ-0164 announced in October 1989. Costing $1,995 US and weighing one pound (nearly half a kilogram), it housed an 80C88 processor running at 7 MHz, 512 KB of RAM, and a non-backlit CGA-compatible LCD panel. All of this was powered by two standard AA batteries. For software it ran MS-DOS 3.3, and applications such as Lotus 1-2-3 version 2.2 and WordPerfect 5.1 were available on PCMCIA ROM cards.


The Poqet PQ-0164 with memory expansion and Lotus 1-2-3 ROM cards. Image source:

By March 1998, the then-current device to have was the Palm III by Palm Computing, a division of 3Com. Compared to the Poqet PC, the price had come down greatly, at $400 US upon release. Its specifications consisted of a 16 MHz Motorola DragonBall processor and 2 MB of RAM, running Palm OS 3.0. This particular model introduced infrared support for transferring data to another Palm III.

Alternatively, a more expensive option was a device running Windows CE 2.0, an operating system designed for hand-held and embedded systems. One such device was the Hitachi Persona HPW-200JC, released November 6, 1998 for the Japanese market. It was priced considerably higher than the Palm III at around 138,000 yen (JPY), equating to approximately $1,168 US or $1,839 AUD at exchange rates on that day. As expected, the hardware was more powerful than the Palm III’s, albeit weighing almost double the Poqet PC. Running Hitachi’s own 32-bit RISC processor at 100 MHz, it included 16 MB of RAM (upgradable to 48 MB), an 8.1″ 256-colour 640 x 240 resolution screen, and even a 33.6 Kbps modem for dial-up internet access.

Windows CE 2.0 very much replicated the Windows 95 user interface, though modified for the reduced screen real estate. An advantage over Palm OS, likely to have been useful in the corporate world, was integration with the Microsoft Office suite: synchronising with Outlook on your office PC, and having the so-called Pocket editions of Word, Excel, and PowerPoint. Palm OS on the other hand wasn’t attempting to replicate features found on your PC, emphasising simplicity instead.


A May 1998 issue of PC Mag explained how OROM storage was to work.

For such devices at the time, the main shortcomings were battery life and wireless communications. Battery life could range from several hours to around a month depending on the model and usage patterns. Personally, if the battery in such a device lasted a few weeks, in hindsight I feel that’s reasonable based on past experience playing with Nintendo Game Boys. As for connectivity, apart from connecting to your PC using a serial cable for data transfer, or using infrared to other devices if available, you were left with add-on modem attachments for dial-up internet access.

Optical read-only memory (OROM) was seen as a way to increase storage capabilities for Windows CE and Java-based devices. Ioptics, a start-up company from Bellevue, Washington, was set to introduce data cards with 128 MB of storage capacity. These were to be read with a designated external card reader, with 10 millisecond access times, equivalent to a typical hard disk at the time. A second generation of data cards would increase the capacity to 320 MB. This grabbed the attention of Microsoft, with the cards expected on the market by the “summer” of 1998.

2001 – Similar to the Mainstream Graphics section of the article, details were vague as to what to expect in three years. It was expected that hand-held devices could go for several weeks between battery charges, even with a modem attached, which at the time was a further drain on power. The majority of third-party software development was aimed at Palm OS, so the attitude was that momentum for Windows CE needed to continue, to prevent stagnation and eventual dismissal of the platform.

Verdict – By 2001, both Palm and Windows CE-based devices still enjoyed steady sales. Market share, however, favoured Palm by a long margin, who had just released their m500 series. Notable improvements over its predecessors were the introduction of Motorola’s 33 MHz DragonBall VZ processor, 8 to 16 MB of RAM, a 16-bit colour display, and support for SD and MMC storage cards. Preinstalled on the m500 was Palm OS 4.0. Compared to version 3.0, this offered support for the m500’s newer hardware components (e.g. colour screens), Palm’s Universal Connector for use with accessories and charging via a USB port, and integration of the Mobile Internet Kit that was sold separately for older models.


Palm’s M515 in German.

From 2000, Windows CE was up to version 3.0, marking the start of a greater shift toward devices more closely aligned with Palm’s. There was the ‘standard’ iteration of CE 3.0, similar to 2.0 though with improved stability and an appearance resembling Windows 98.

Also based on CE 3.0 was an offshoot known as Pocket PC 2000, released in April 2000. Pocket PC had a dramatically different user interface, designed for devices that more closely resembled what Palm was offering, for looking up your address book and calendar appointments. Eighteen months later came the subsequent version, Pocket PC 2002, aligning aesthetically with Windows XP. From then on, Pocket PC was rebranded as Windows Mobile (now Windows Phone), tailored towards smartphone devices, albeit with low market share.

Literally within the first few days of 2002, Windows CE 4.0 became available. Similar to Pocket PC 2002, it was also given the Windows XP makeover but retained the more traditional desktop interface. It offered improved security, such as the use of Kerberos authentication over a network, Bluetooth support, and introduced the .NET Compact Framework for application development.

Originally $1,099 US, the Casio Cassiopeia E-200 ran Pocket PC 2002. Weighing only 190 grams, its hardware was more substantial than the 1998 Hitachi’s, offering Intel’s 200 MHz StrongARM 1110 processor, 64 MB of RAM, and a 16-bit 240 x 320 colour display. It marked the end of Casio producing models for the Western market, however, with the company returning to the Japanese domestic market.

Ioptics’ promise to commence production of its 128 MB storage cards failed to come to fruition. All I found were references to Ioptics planning to release such cards in 1998, but nothing following up on the outcome. It seems likely the company was either acquired or folded, never to be heard of again. Nevertheless, the advent of SD and MMC cards took over storage concerns.

In the end, production of such devices was in decline towards the late 2000s. Similar to what happened with digital cameras and MP3 music players, smartphones reached a point where additional devices weren’t required.

In Part 3, the series continues with home automation and the transitional period for displays.


