Technological Evolution

Abstract

Problem: How did the relentless pace of technological change affect game development across four decades, from 1981 to the present?

Approach: Tim Cain walks through his personal experience shipping games across every major hardware and software transition: from text-mode CGA through VGA bank-switching, VESA standardization, the 2D-to-3D revolution, OS churn, engine adoption, and language shifts.

Findings: Every few years, foundational technology changed so dramatically that it invalidated old approaches, demanded new skills (assembly → C → C++ → C#), and forced developers to constantly relearn their craft. Unlike film or publishing, where core techniques remain stable for decades, game development is defined by perpetual technological upheaval.

Key insight: Technological evolution in games never ends and no one is immune. Embrace it, because constant learning is both the stress and the joy of the industry.

Source: https://www.youtube.com/watch?v=rr61gn2k46o

The Consumer Experience vs. The Developer Experience

Tim opens by distinguishing two sides of technological evolution. As a consumer, the rapid pace (from handheld Coleco games to Pong to Atari VCS to Nintendo) was exciting but financially punishing. You'd buy a PC and it would be obsolete within months as better sound cards, video cards, and processors appeared.

But Tim's focus is on the developer side, starting from 1981: making games where your target platform was constantly shifting. This is fundamentally different from film or TV, where directing techniques have remained largely the same for 20-40 years. In games, every technological change could break old games, open up new possibilities, and demand entirely new approaches.

The Graphics Hardware Gauntlet

CGA, EGA, VGA: The Early Modes

Tim's first game, Grand Slam Bridge (~1983-1986), was entirely text-based, using ASCII characters. It worked on any PC and still works today: an elegant sidestep of hardware compatibility.

When he returned to game development in 1991 with Bard's Tale Construction Set, he had to support three graphics modes simultaneously:

  • CGA: 4 predefined colors per palette (black, white, magenta, cyan). Two fixed palettes, no choice. "Good luck making the graphics you want."
  • EGA: 16 colors
  • VGA: 320×200 with 256 colors, the "new hot thing"

Super VGA and Bank Switching Hell

By 1994 when Fallout began development, resolution had jumped to 640×480 with 256 colors, nearly five times the pixels of standard VGA. But this "Super VGA" mode required bank switching: the PC could only access 64KB of video memory at a time, so the screen was divided into banks. Drawing a pixel required knowing which bank that screen line fell in and switching to it.

The nightmare: every video card bank-switched differently. Developers had to write assembly code for each chipset variant (Trident 9600 chipset A, B, and C were all different). The only options were:

  1. Ask the user what video card they had (they often didn't know)
  2. Auto-detect (unreliable)
  3. Guess and pray ("I'm going to try to display a screen, pray you don't crash")

VESA: Standardization Arrives

VESA (Video Electronics Standards Association) eventually standardized video access. By the time Arcanum shipped in 2001 at 800×600 with 256 colors, VESA-compliant drivers handled bank switching automatically. You'd just say "I want 800×600, 256 colors" and the driver handled the rest. A massive quality-of-life improvement for developers.

The Mac Detour

Fallout was Tim's one and only Mac game. The reason he never went back: every new Mac OS release broke backward compatibility. Old software simply stopped working. Eventually, people found it easier to play the Windows version of Fallout under emulation on a Mac than to run the native Mac version on a newer Mac OS. Tim's blunt assessment: Apple showed very little interest in games, so he returned the favor.

The Windows Version Treadmill

During Arcanum's development alone, the team had to test across Windows 95, Windows NT, Windows 98, Windows 98 Second Edition, Windows 2000, Windows ME, and Windows XP (which shipped right as the game did). Each version needed verification: not just "does it start" but frame rate, stability, and API compatibility. New Windows releases kept arriving every year from 2001 through 2005, when Troika shut down.

The 2D to 3D Revolution

Temple of Elemental Evil: The Hybrid

Temple of Elemental Evil was Tim's first game with 3D graphics, but only partially. Backgrounds were still 2D, with a 3D collision mesh and 3D characters rendered through the video card's 3D API. Every video card handled 3D slightly differently, typically accessed through DirectX.

Vampire Bloodlines: Full 3D

Just a couple of years later, Vampire: The Masquerade – Bloodlines was fully 3D. This was "a whole new way of making a game": suddenly you worried about polygon counts, texture sizes, and rendering concerns that didn't exist in 2D, where everything was pre-rendered into sprites.

The Math Wall

The shift to 3D created a hard skill barrier. In 2D, you could get by in the game industry without much math. In 3D, linear algebra became mandatory β€” vectors, matrices, quaternions. Artists, programmers, and level designers all needed it. Tim's engineering background gave him a head start, but he knew programmers who simply stayed in 2D for the rest of their careers rather than make the leap.

Programming Languages Across a Career

Tim's language journey mirrors the industry's evolution:

  • Grand Slam Bridge (1983): Written in C (possibly assembly)
  • Every game through Arcanum/Temple of Elemental Evil: C
  • Vampire: Bloodlines (~2001-2004): First game in C++
  • C++ continued through to The Outer Worlds (Unreal Engine)
  • Pillars of Eternity (Unity): C#, a brief detour
  • Then back to C++ for Unreal

The Engine Question

Tim built his own engines from 1981 through 2011, with one exception: Vampire Bloodlines used Valve's Source engine (2001-2004).

At Obsidian, things changed. The studio had proprietary engines: South Park was built on an engine originally made for Dungeon Siege 3 ("the two games don't look at all alike"). Then came Unity for Pillars of Eternity, where you had no source code access and could only do what Unity allowed. Then came Unreal for The Outer Worlds: a whole new engine, language (back to C++), and workflow. You had source access but were responsible for merging your changes with every engine update. "Programmers know what I'm talking about."

The Takeaway: It Never Ends

Tim's summary is characteristically direct: technological evolution has been constant from his first day in 1981 through 2024 and beyond. No technology lasts forever: not Unity, not Unreal, nothing. New engines will emerge, and old ones will be displaced or change so dramatically between versions that you have to relearn everything.

No discipline is immune. Even narrative designers face it: first just text, then voiceover, then lip-syncing, each adding new constraints and skills.

Tim's advice: embrace it. The constant learning is simultaneously the most exciting and most stressful part of game development. "That's what this industry is all about: technological evolution."
