Fallout's Memory Model

Abstract

Problem: How did the original Fallout (1997) manage to run on machines with only 16MB of RAM while rendering Super VGA graphics and loading extensive game data?

Approach: After discovering a critical bug in the Watcom C compiler's realloc implementation, Tim Cain built a custom memory allocator that grabbed one large block at startup and managed all allocations internally using a handle-based system with locking and compaction.

Findings: The custom memory model eliminated memory fragmentation, the real culprit behind out-of-memory errors, and allowed Fallout to run on far lower-spec machines than would otherwise have been possible, directly expanding the game's potential market.

Key insight: Fallout wasn't running out of memory; it was running out of contiguous memory. A handle-based allocator with compaction solved this by letting the engine move unlocked blocks to defragment the heap on the fly.

Source: https://www.youtube.com/watch?v=6kB_fko6SIg

Context: PCs in the Mid-90s

Tim Cain sets the scene: the minimum spec for Fallout was 16MB of RAM. After the OS took its share, the game had even less. To put it in perspective, Fallout's 640×480 screen takes roughly 300KB at 8 bits per pixel, and about 48 of those screens' worth was all the memory the entire game had to work with. Code, data, art, everything.

The PC landscape was fragmented too: different manufacturers, different video cards. By Fallout's development, VESA drivers were common enough to assume, but memory remained brutally scarce. Super VGA was new and hungry.

The Watcom Compiler Bug

This story begins around 1994, during early engine development. Tim was getting memory overwrites: writing data to one location would corrupt data elsewhere. He traced the problem to realloc, one of C's standard memory functions.

The bug: Watcom's realloc would sometimes return a block of memory that overlapped with a block already allocated to someone else. Two separate allocations pointing to the same physical memory.

Tim emailed Watcom's customer support. They didn't believe him, understandably, since most "compiler bug" reports are actually programmer errors. So Tim wrote a minimal reproduction program:

  1. Allocate an array of memory blocks, storing each pointer and size
  2. Loop: pick a random block, realloc it to a new size
  3. After each realloc, verify that no blocks overlap
  4. If they do, print "memory integrity failed"

The program would fail within seconds. Tim sent it to Watcom. A few days later, they confirmed: the bug was real and a patch was incoming.

The Trust Problem

Even after the fix, a lingering issue remained: trust. Every subsequent bug raised the question: "is this my code, or is it the compiler?" Tim notes this wasn't unique to Watcom; developers using Microsoft's compiler had the same anxiety, especially around optimization flags that could introduce subtle bugs when compiling for speed or size.

The Custom Memory Model

Rather than continue relying on the standard library's memory management, Tim made a decisive architectural choice: Fallout would manage its own memory.

How It Worked

  1. One big malloc at startup. When Fallout launched, it called malloc exactly once and grabbed the largest block of memory it could. If the machine couldn't provide enough, it wasn't min-spec. After that, malloc was never called again.

  2. Internal allocator. When the game needed memory, it called custom functions that carved portions out of that single large block. The team reimplemented malloc, realloc, and free themselves.

  3. Handles instead of pointers. The game didn't receive direct pointers to memory. Instead, it received handles. To actually use the memory, you had to lock the handle, which returned a valid pointer. While locked, the pointer was guaranteed stable. When you were done, you unlocked it.

  4. Compaction. Any unlocked block could be moved, copied to a new location within the big block. Since nobody held a direct pointer to it (only a handle), this was safe. The engine could slide all unlocked blocks together, consolidating free space into one contiguous region.

Why This Mattered

The critical insight: Fallout wasn't actually out of memory most of the time. It was out of contiguous memory. A request for 1,000 bytes would fail even though there were 500 bytes free here and 500 bytes free there: classic memory fragmentation.

The handle-and-compaction system eliminated this problem. By defragmenting on the fly, Fallout could use nearly all of its available memory effectively. This had direct business impact:

  • The game ran on lower-spec machines than it otherwise could have
  • This meant more potential customers β€” something Interplay and marketing cared deeply about
  • Memory-related crashes and failures dropped dramatically

Then vs. Now

Tim contrasts this with modern development. Today, operating systems use virtual address spaces and page tables. When a program requests memory, the OS provides a virtual address that may not even correspond to physical RAM yet; it gets mapped on access. A program might think it's using a gigabyte while actually occupying far less physical memory.

None of this existed for DOS-era games. Developers were working "right above the metal," and shipping a game required deep expertise in optimization and low-level code. Tim notes that he could rebuild Fallout today in Unity much faster and with a smaller team, but the skill set required in the 90s was fundamentally different, not lesser.

Source

Fallout's Memory Model, Tim Cain (YouTube)